
CN113965740A - Projection equipment and control method thereof - Google Patents

Info

Publication number
CN113965740A
CN113965740A
Authority
CN
China
Prior art keywords
projection
distance
initial
acceleration
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111250949.3A
Other languages
Chinese (zh)
Inventor
陈许
张冬冬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Hisense Laser Display Co Ltd
Original Assignee
Qingdao Hisense Laser Display Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Hisense Laser Display Co Ltd filed Critical Qingdao Hisense Laser Display Co Ltd
Priority to CN202111250949.3A priority Critical patent/CN113965740A/en
Publication of CN113965740A publication Critical patent/CN113965740A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141 Constructional details thereof
    • H04N 9/3191 Testing thereof
    • H04N 9/3194 Testing thereof including sensor feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)

Abstract

Embodiments of the present application provide a projection device and a control method thereof, in the technical field of projection, for automatically correcting an image to be projected after the projection device moves, so as to avoid a mismatch between the image to be projected and the projection screen and preserve the user's viewing experience. The projection device includes: a projection assembly for projecting an image to be projected onto a projection screen; a first ranging sensor for detecting a first distance between itself and the projection screen; a second ranging sensor for detecting a second distance between itself and the projection screen; and a controller connected to the projection assembly, the first ranging sensor, and the second ranging sensor, respectively, the controller being configured to: acquire the first distance and the second distance; and correct the image to be projected when the first distance differs from an initial distance and/or the second distance differs from the initial distance.

Description

Projection equipment and control method thereof
Technical Field
The present application relates to the field of projection technologies, and in particular, to a projection device and a control method thereof.
Background
With the development of electronic technology, projection devices are used ever more widely, chiefly in teaching, demonstration, entertainment, and work scenarios. In daily use, a projection device usually projects an image onto a projection screen matched to it. When the projection device moves, the projected picture changes, so that the projected image no longer matches the projection screen and the viewing effect suffers. Existing projection devices therefore provide a geometric-correction function that corrects the projected image when it deviates, ensuring the projection effect.
However, with existing projection devices the user must trigger geometric correction manually, either by inputting an instruction via a remote control or by issuing a voice instruction to the projection device. This triggering mode requires human participation and is relatively cumbersome.
Disclosure of Invention
The embodiment of the application provides a projection device and a control method thereof, which can automatically correct a to-be-projected image after the projection device moves.
In a first aspect, an embodiment of the present application provides a projection device, including: a projection assembly for projecting an image to be projected onto a projection screen; a first ranging sensor for detecting a first distance between itself and the projection screen; a second ranging sensor for detecting a second distance between itself and the projection screen; and a controller connected to the projection assembly, the first ranging sensor, and the second ranging sensor, respectively, the controller being configured to: acquire the first distance and the second distance; and correct the image to be projected when the first distance differs from an initial distance and/or the second distance differs from the initial distance; the initial distance being the distance between either ranging sensor and the projection screen when the projection device is in an initial posture.
In this solution, the first and second ranging sensors are arranged on the side of the projection device near the light outlet: the first detects a first distance to the projection screen and the second detects a second distance to it. When the first distance differs from the initial distance and/or the second distance differs from the initial distance, the controller of the projection device can determine that the device has deflected relative to the projection screen. In that case the controller triggers correction of the image to be projected automatically, without manual correction by the user, preserving a good viewing experience while reducing user operations.
In a second aspect, there is provided a projection apparatus, comprising: the projection assembly is used for projecting an image to be projected onto a projection screen; a motion sensor for detecting an acceleration of the projection device; a controller connected to the motion sensor and the projection assembly, respectively, the controller configured to: acquiring the acceleration of the projection equipment; and when the acceleration of the projection equipment is different from the initial acceleration, correcting the image to be projected, wherein the initial acceleration is the acceleration detected by the motion sensor when the projection equipment is in the initial posture.
In this solution, the projection device is provided with a motion sensor, so that the controller can determine, from the acceleration the sensor detects, whether the device has moved relative to the projection screen. When it determines that the device has moved (i.e., the acceleration differs from the initial acceleration), the controller triggers correction of the image to be projected automatically, without manual correction by the user, preserving a good viewing experience while reducing user operations.
In a third aspect, a projection system is provided, comprising: a projection screen and the projection device of any one of the first or second aspects provided above; wherein the projection device is configured to project a projection image on the projection screen.
In a fourth aspect, a control method for a projection device is provided, including: acquiring a first distance, detected by a first ranging sensor, between that sensor and the projection screen, and a second distance, detected by a second ranging sensor, between that sensor and the projection screen; and correcting the image to be projected when the first distance differs from the initial distance and/or the second distance differs from the initial distance; the initial distance being the distance detected between either ranging sensor and the projection screen when the projection device is in the initial posture.
In a fifth aspect, the present application provides a computer-readable storage medium including computer instructions which, when executed on a computer, cause the computer to perform the method provided in the second aspect and its possible implementations.
In a sixth aspect, embodiments of the present application provide a computer program product that can be loaded directly into a memory and contains software code which, when loaded and executed by a computer, carries out the method provided in the second aspect and its possible implementations.
For the beneficial effects of the third to sixth aspects, reference may be made to the analysis of the beneficial effects of the first aspect; details are not repeated here.
Drawings
Fig. 1(a) is a schematic view of a projection system according to an embodiment of the present disclosure;
fig. 1(b) is a schematic composition diagram of a projection apparatus provided in an embodiment of the present application;
fig. 2 is a schematic diagram illustrating a ranging sensor according to an embodiment of the present disclosure;
fig. 3 is a flowchart of a control method of a projection apparatus according to an embodiment of the present disclosure;
FIG. 4(a) is a schematic view of another projection system provided in an embodiment of the present application;
FIG. 4(b) is a schematic diagram of another projection apparatus according to an embodiment of the present application;
Fig. 5 is a flowchart of another control method for a projection apparatus according to an embodiment of the present disclosure;
FIG. 6(a) is a schematic view of another projection system provided in an embodiment of the present application;
FIG. 6(b) is a schematic diagram of another projection apparatus provided in the embodiments of the present application;
FIG. 7 is a flowchart of a method for correcting a projected image according to an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of a test card according to an embodiment of the present disclosure;
fig. 9 is a schematic view of an application scenario in which a projection system interacts with a control device and a server according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a control device according to an embodiment of the present disclosure;
fig. 11 is a schematic application layer diagram of a projection device according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that all directional indicators in the embodiments (such as up, down, left, right, front, and rear) are used only to explain the relative positional relationships, movements, and so on between components in a specific posture (as shown in the drawings); if that posture changes, the directional indication changes accordingly.
The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present application, "a plurality" means two or more unless otherwise specified.
In the description of the present application, it should be noted that, unless explicitly stated or limited otherwise, the terms "connected" and "coupled" are to be interpreted broadly: a connection may, for example, be fixed, detachable, or integral. Those of ordinary skill in the art can understand the specific meaning of these terms on a case-by-case basis. In addition, when a pipeline is described, "connected" also carries the meaning of conducting; the specific meaning is to be understood from the context.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs; rather, these words are intended to present related concepts in a concrete fashion.
Fig. 1(a) is a schematic diagram of a projection system according to an embodiment of the present disclosure. As shown in fig. 1(a), the projection system includes a projection apparatus 100 and a projection screen 200.
In practical use, the light outlet of the projection apparatus 100 faces the projection screen and emits a light beam toward it; the projection screen reflects the beam to display the picture.
Alternatively, projection device 100 may have other names, such as, but not limited to, a projection host, etc.
Projection apparatus 100 is described with reference to fig. 1(a) and 1 (b). The projection device 100 includes a first ranging sensor 101, a second ranging sensor 102, a projection assembly 103, and a controller 104. It should be understood that fig. 1(a) and 1(b) only show some of the components of projection device 100, and that other components not shown may also be present in projection device 100.
The projection assembly 103 is configured to emit light to project a picture onto the projection screen 200.
For example, the projection assembly 103 may include a projection lens 1031, a projection light source 1032, and an optical engine 1033. The projection light source 1032 emits light for image projection; the light travels to the optical engine 1033, is modulated there according to the image to be displayed, and then enters the projection lens 1031, which projects it out of the light outlet to form the projected image on the projection screen 200.
As shown in fig. 1(a), the first distance measuring sensor 101 and the second distance measuring sensor 102 are disposed on a side of the projection apparatus 100 close to the light outlet. The first ranging sensor 101 is used to detect a first distance S1 between the first ranging sensor 101 and the projection screen. The second ranging sensor 102 is used to detect a second distance S2 between the second ranging sensor 102 and the projection screen.
The distance between the first ranging sensor 101 and the second ranging sensor 102 is greater than or equal to a first preset threshold and smaller than a second preset threshold. The first preset threshold may be, for example, 15 cm or 20 cm, and the second preset threshold 20 cm or 25 cm; the specific values are constrained by the width of the projection apparatus 100 near the light outlet.
Optionally, when the projection device is placed on a horizontal surface, the line connecting the first ranging sensor 101 and the second ranging sensor 102 is horizontal.
Optionally, the first ranging sensor 101 and the second ranging sensor 102 may be laser ranging sensors. To measure the distance to a target, a laser ranging sensor first aims at the target and emits a laser pulse; after reflection off the target, the laser light scatters in all directions. Part of the scattered light returns to the sensor, which records and processes the time from emission of the light pulse to its return; from this time interval and the propagation speed of light, the distance between the sensor and the target can be computed.
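The time-of-flight principle described above reduces to a single formula: distance equals half the product of the speed of light and the pulse's round-trip time. The following is a minimal illustrative sketch (function name and sample timing value are assumptions, not taken from the patent):

```python
# Time-of-flight distance: d = c * t_round_trip / 2

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the target computed from a laser pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2

# A pulse returning after roughly 20 ns corresponds to a target about 3 m away,
# a plausible projector-to-screen distance.
print(tof_distance_m(20e-9))
```

The division by two accounts for the pulse travelling to the screen and back; at these scales the sensor must resolve nanosecond intervals, which is why the control center records the emission and reception times directly.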
Fig. 2 shows a schematic diagram of the composition of a first distance measuring sensor 101. As shown in fig. 2, the first distance measuring sensor 101 at least includes modules such as a control center 301, a power driver 302, a laser 303, an optical lens 304, an optical filtering module 305, and a data processing module 306.
The control center 301 is the control and computation center of the entire sensor: it records the time elapsed from the emission of each optical signal to its return (the interval between the emitted pulse and the received pulse) and determines the distance between the ranging sensor 300 and the test target from this interval and the propagation speed of light.
The power driver 302 is used for providing voltage and current for the laser 303 under the control of the control center 301, so that the laser 303 can normally operate.
The laser 303 is a device capable of emitting laser light and is used to emit laser pulse waves under the control of the control center 301. Optionally, in this embodiment of the present application, the laser 303 may be a vertical-cavity surface-emitting laser (VCSEL); a VCSEL can emit infrared laser pulses at a wavelength of about 940 nm, and its emitted beam is circular with a small divergence angle, making it easier to couple with optical fibers and other optical elements.
The optical lens 304 is used to transmit the laser light and collimate the beam.
The optical filtering module 305 filters out light of other wavelengths and ambient light, so that the ranging sensor 300 receives only the reflected laser light at 940 nm.
The data processing module 306 contains a light-sensing sub-module, an optical-signal analysis sub-module, and a photoelectric conversion sub-module. The light-sensing sub-module consists of several photosensitive arrays and receives the pulse wave reflected from the target object.
It is to be understood that the specific structure of the second ranging sensor 102 may refer to the specific structure of the first ranging sensor 101 described in fig. 2.
Optionally, with continued reference to fig. 1(a) and 1(b), the projection apparatus 100 may further include a first camera 105 and a second camera 106. Both cameras are disposed on the side of the projection apparatus 100 close to the light outlet, to capture the projected image on the projection screen for subsequent image-correction processing.
In an embodiment of the present application, the controller 104 is configured to: acquire the first distance and the second distance; and correct the image to be projected when the first distance differs from the initial distance and/or the second distance differs from the initial distance; the initial distance being the distance detected between either ranging sensor and the projection screen when the projection device is in the initial posture.
It should be noted that, when the projection apparatus 100 and the projection screen 200 are used together, their positions must be properly aligned so that the image to be projected by the projection apparatus 100 matches the projection screen 200.
Thus, when the projection apparatus 100 is first started, an installer places the projection apparatus 100 and the projection screen 200 in suitable positions and completes projection debugging, so that the image to be projected fully matches the projection screen 200. The relative posture of the projection apparatus 100 and the projection screen 200 at that point is the initial posture described above. The controller 104 can then obtain the distance between the projection screen 200 and the first ranging sensor 101 or second ranging sensor 102 (i.e., the initial distance) and record it in the local database of the projection apparatus 100 for later use.
Since the first ranging sensor 101 and the second ranging sensor 102 are both disposed on the side of the projection apparatus 100 close to the light outlet, in the initial posture the distance between the first ranging sensor 101 and the projection screen 200 equals the distance between the second ranging sensor 102 and the projection screen 200.
During subsequent use of the projection apparatus 100, the placement position and the angle between the projection apparatus 100 and the projection screen 200 should remain unchanged to ensure a good projection effect. However, the projection device or the projection screen may be displaced, for example by someone moving it or by an accidental collision. Even a slight deviation of the projection apparatus 100 or the projection screen 200 can distort or skew the projected image, in which case the controller 104 can correct the image to be projected.
Optionally, besides at the first start-up, the first ranging sensor 101 and the second ranging sensor 102 may detect the distance to the projection screen 200 at every subsequent start-up of the projection apparatus 100 or while it is operating. The controller 104 can thus automatically obtain the first and second distances and determine whether the projection device has deflected relative to the projection screen.
It will be appreciated that, in general, when the projection device has not moved relative to the projection screen, the first distance and the second distance both equal the initial distance. Accordingly, if the first distance differs from the initial distance and/or the second distance differs from the initial distance, the projection device has moved relative to the projection screen, and to ensure good viewing the controller 104 corrects the image to be projected.
For example, define the direction from the first ranging sensor 101 to the second ranging sensor 102 in the initial state of the projection apparatus 100 as the first direction, the projection direction of the projected image in the initial state as the second direction, and the initial distance as S0.
If the controller 104 detects that the first distance and the second distance are both unequal to the initial distance but equal to each other, i.e., S1 = S2 ≠ S0, the projection apparatus 100 or the projection screen 200 may have shifted along the second direction or the vertical direction. The image to be projected then no longer matches the projection screen 200, so the controller 104 corrects it.
Alternatively, if the controller 104 detects that the first distance is unequal to the initial distance and/or the second distance is unequal to the initial distance, i.e., S1 ≠ S0 and/or S2 ≠ S0, the projection apparatus 100 or the projection screen 200 may have shifted along the first direction. The image to be projected then no longer matches the projection screen 200, so the controller 104 corrects it.
In the cases above where the first distance or the second distance differs from the initial distance, the greater the separation between the first ranging sensor 101 and the second ranging sensor 102, the more pronounced the change in S1 or S2 when the projection apparatus 100 or the projection screen 200 shifts along the first direction, which improves detection accuracy.
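The comparison logic described above can be sketched as follows. This is a minimal illustration only: the tolerance value and all names are assumptions, and the patent does not specify how measurement noise is handled.

```python
# Compare the two measured distances S1, S2 against the initial distance S0
# recorded at setup, and decide whether (and roughly why) to correct.

TOLERANCE_M = 0.005  # assumed: treat sub-5 mm differences as measurement noise

def check_posture(s1: float, s2: float, s0: float) -> str:
    """Classify the device's offset from the two ranging-sensor readings."""
    if abs(s1 - s0) <= TOLERANCE_M and abs(s2 - s0) <= TOLERANCE_M:
        return "no correction needed"
    if abs(s1 - s2) <= TOLERANCE_M:
        # S1 = S2 != S0: shifted along the second (projection) direction
        return "correct: shift along projection direction"
    # S1 != S2: unequal changes, consistent with an offset along the first direction
    return "correct: deflection along sensor axis"

print(check_posture(2.00, 2.00, 2.00))
print(check_posture(2.10, 2.10, 2.00))
print(check_posture(2.10, 2.00, 2.00))
```

A wider sensor baseline makes the S1/S2 asymmetry in the third case larger for the same deflection angle, which is the detection-accuracy benefit noted above.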
It should be understood that the above embodiments describe a projection device with only two ranging sensors as an example. In actual use, more ranging sensors, for example three, may be arranged on the projection device to determine more accurately whether it has moved relative to the projection screen.
Based on the projection system shown in fig. 1(a), as shown in fig. 3, an embodiment of the present application provides a method for controlling a projection apparatus, where the method includes the following steps:
s101, the controller acquires a first distance and a second distance.
The first distance is the distance between the first distance measuring sensor and the projection screen. The second distance is the distance between the second distance measuring sensor and the projection screen.
S102, when the first distance is different from the initial distance and/or the second distance is different from the initial distance, the controller instructs the projection device to correct the image to be projected.
The initial distance is the distance between either ranging sensor and the projection screen when the projection device is in the initial posture.
Alternatively, a specific method for correcting the image to be projected may refer to the embodiment shown in fig. 7.
In this embodiment, since the first ranging sensor detects the first distance to the projection screen and the second ranging sensor detects the second distance to it, the controller of the projection device can determine that the device has deflected relative to the projection screen when the first distance differs from the initial distance and/or the second distance differs from the initial distance. In that case the controller triggers correction of the image to be projected automatically, without manual correction by the user, preserving a good viewing experience while reducing user operations.
Fig. 4(a) is a schematic diagram of another projection system provided by the present application. As shown in fig. 4(a), the projection system includes a projection apparatus 100 and a projection screen 200.
Projection apparatus 100 is described with reference to fig. 4(a) and 4 (b). Projection device 100 includes a projection assembly 103, a controller 104, and a motion sensor 107. It should be understood that fig. 4(a) and 4(b) only show some of the components of projection device 100, and that other components not shown may also be present in projection device 100.
Wherein the motion sensor 107 is used to detect the acceleration of the projection device 100.
Optionally, the motion sensor 107 may be a three-axis or a six-axis motion sensor. A three-axis motion sensor is small and light, and thus easy to install in the projection device; it can detect the magnitude of acceleration along each of the device's (typically three) axes and, when the device is stationary, the magnitude and direction of the acceleration due to gravity. A six-axis motion sensor integrates a three-axis accelerometer with a three-axis gyroscope: besides the acceleration of the projection apparatus 100, it can also detect the device's offset angle and displacement, from which the controller 104 can determine the position of the projection apparatus 100 in order to correct the image to be projected.
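As one illustration of how an offset angle can be derived from accelerometer data alone, the static gravity vector determines the device's tilt. The axis conventions and function name below are assumptions, not part of the patent:

```python
# Estimate pitch and roll from static three-axis accelerometer readings
# (in units of g), using the direction of the gravity vector.

import math

def tilt_angles_deg(ax: float, ay: float, az: float) -> tuple[float, float]:
    """Return (pitch, roll) in degrees for a stationary device."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# A level device has gravity entirely on the z axis, giving zero pitch and roll.
print(tilt_angles_deg(0.0, 0.0, 1.0))
```

A controller could compare such angles against those recorded in the initial posture to decide how far the device has tilted; yaw about the vertical axis is invisible to the accelerometer, which is one reason a six-axis module adds a gyroscope.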
In an embodiment of the present application, the controller 104 is configured to: acquiring the acceleration of the projection equipment; the controller corrects the image to be projected when the acceleration of the projection apparatus is different from the initial acceleration.
It should be noted that the acceleration of the projection apparatus 100 in its initial posture is the initial acceleration; the device is then in a relatively static state, and the initial acceleration is simply the value the motion sensor 107 detects in that state. During actual use, the placement position or angle of the projection apparatus 100 may shift; for example, the device may translate, roll, or tilt under an external force. At such times the acceleration of the projection apparatus 100 in the vertical direction differs from the initial acceleration.
Accordingly, when the acceleration of the projection apparatus 100 in the vertical direction differs from the initial acceleration, the controller 104 determines that the device is in motion. The posture of the projection apparatus 100 after the movement ends may differ from the initial posture, so that the image to be projected no longer matches the projection screen 200; the controller 104 therefore corrects the image to be projected.
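The trigger just described amounts to a threshold comparison between the current reading and the stored initial reading. A minimal sketch, with an assumed noise threshold and names not taken from the patent:

```python
# Flag correction when the current vertical acceleration (in g) departs
# from the initial at-rest value by more than measurement noise.

NOISE_G = 0.02  # assumed accelerometer noise floor, in g

def needs_correction(current_g: float, initial_g: float) -> bool:
    """True when the device's posture has changed relative to the initial one."""
    return abs(current_g - initial_g) > NOISE_G

print(needs_correction(1.00, 1.00))  # device at rest in initial posture
print(needs_correction(1.08, 1.00))  # reading changed: trigger correction
```

In practice the controller would wait for readings to settle again (movement ended) before running the correction itself, since correcting mid-motion would target a posture that is still changing.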
Based on the projection system shown in fig. 4(a), as shown in fig. 5, an embodiment of the present application further provides another control method for a projection apparatus, where the method includes the following steps:
s201, the controller acquires the acceleration of the projection equipment in the vertical direction.
It will be appreciated that the initial acceleration described above is the acceleration detected by the motion sensor when the projection device is in a relatively stationary state. When the projection device is in a motion state, its acceleration in the vertical direction may not be equal to the initial acceleration.
S202, when the acceleration of the projection device in the vertical direction is different from the initial acceleration, the controller corrects the image to be projected.
The initial acceleration is the acceleration detected by the motion sensor when the projection apparatus is in the initial posture, that is, when the projection apparatus 100 is in the relatively stationary state.
Alternatively, a specific method for correcting the image to be projected may refer to the embodiment shown in fig. 7.
Based on the above embodiment, a motion sensor is arranged on the projection device, so that the controller can determine directly from the acceleration of the projection device in the vertical direction that its placement position or angle has shifted, and automatically correct the image to be projected. This judgment is simpler and more direct, allowing the projection device to correct the image to be projected more quickly.
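Illustratively, the comparison between the detected acceleration and the initial acceleration described in steps S201 to S202 can be sketched as follows. The function name, the tolerance value, and the three-axis vector representation are illustrative assumptions rather than details specified by this embodiment; a small tolerance is used because sensor noise makes exact equality between two readings unlikely.

```python
import math

def needs_correction(accel, initial_accel, tol=0.05):
    """Return True when the measured acceleration vector deviates from the
    initial (at-rest) acceleration by more than `tol`, indicating that the
    placement position or angle of the projection device has shifted and
    the image to be projected should be corrected."""
    # Euclidean distance between the two three-axis readings (e.g. in g).
    return math.dist(accel, initial_accel) > tol

# A projector at rest reads roughly gravity on the vertical axis.
initial = (0.0, 0.0, 1.0)
print(needs_correction((0.0, 0.0, 1.0), initial))  # unchanged reading: no correction
print(needs_correction((0.1, 0.0, 1.1), initial))  # tilted or moved: correct the image
```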
As an alternative embodiment, as shown in fig. 6(a), the embodiment of the present application provides a schematic diagram of another projection system. As shown in fig. 6(a), the projection system includes a projection apparatus 100 and a projection screen 200.
Projection apparatus 100 is described with reference to fig. 6(a) and 6 (b). The projection device 100 includes a first ranging sensor 101, a second ranging sensor 102, a projection assembly 103, a controller 104, and a motion sensor 107. It should be understood that fig. 6(a) and 6(b) only show some of the components of projection device 100, and that other components not shown may also be present in projection device 100.
The components of the projection apparatus 100 may refer to the description in the above embodiments, and are not described herein again.
As shown in fig. 7, an embodiment of the present application further provides a method for correcting a projection image, including the following steps:
s301, the controller controls and drives the projection assembly to project the test chart onto the projection screen.
The test card is a geometric-distortion test card, on which regularly arranged patterns assist in testing the distortion of the image. The test card may be, for example, a grid test card, a checkerboard test card, or a dot test card.
Illustratively, fig. 8 shows a dot test card having 16 regularly arranged dots, which form a 4 × 4 lattice. In this embodiment, the 16 regularly arranged dots are the feature points on the test card, and the region where the feature points are located is the feature point region.
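A minimal sketch of the ideal feature-point layout of such a dot test card follows. The function name, the border margin, and the even spacing are illustrative assumptions; the embodiment only requires that the dots be regularly arranged.

```python
def dot_test_card_points(width, height, rows=4, cols=4, margin=0.1):
    """Ideal pixel coordinates of the rows x cols feature points of a dot
    test card, evenly spaced inside a border margin given as a fraction of
    each image dimension. These reference positions are what the feature
    points detected in the shot image are ultimately compared against."""
    xs = [margin * width + i * (1 - 2 * margin) * width / (cols - 1)
          for i in range(cols)]
    ys = [margin * height + j * (1 - 2 * margin) * height / (rows - 1)
          for j in range(rows)]
    # Row-major order: left-to-right within each row, top row first.
    return [(x, y) for y in ys for x in xs]

points = dot_test_card_points(1920, 1080)  # the 16 points of the 4 x 4 lattice
```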
S302, the controller obtains a shooting image obtained by shooting the projection screen by the shooting device.
The shot image comprises a characteristic point area on the test chart.
In some embodiments, the projection device is provided with a first camera and a second camera. Thus, step S302 may be specifically implemented as: acquiring a first image obtained by the first camera shooting the projection screen and a second image obtained by the second camera shooting the projection screen.
And S303, the controller acquires the edge information of the shot image and the frame information of the projection screen.
For example, the controller may extract edge information of the photographed image and a frame image of the projection screen in the photographed image.
Further, based on the extracted frame image of the projection screen, the controller may determine the position information in the shot image of the four vertices of the rectangle formed by the frame of the projection screen.
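As an illustrative sketch (the ordering convention and function name are assumptions, not details given by this embodiment), the four detected frame vertices can be put into a fixed top-left, top-right, bottom-right, bottom-left order before use, so that each vertex is matched to the correct corner of the ideal rectangle:

```python
def order_frame_vertices(pts):
    """Order four screen-frame corner points as top-left, top-right,
    bottom-right, bottom-left. In image coordinates y grows downward, so
    the top-left corner minimizes x + y, the bottom-right maximizes it,
    the top-right minimizes y - x, and the bottom-left maximizes y - x."""
    s = sorted(pts, key=lambda p: p[0] + p[1])
    d = sorted(pts, key=lambda p: p[1] - p[0])
    tl, br = s[0], s[-1]
    tr, bl = d[0], d[-1]
    return [tl, tr, br, bl]

corners = order_frame_vertices([(880, 700), (100, 100), (90, 680), (900, 120)])
```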
The position information of the four vertices is the pixel coordinate data of the four vertices on the shot image.

S304, the controller performs preliminary correction on the feature points of the shot image.
For example, the controller may determine the perspective transformation parameters according to the position information in the shot image of the four vertices of the rectangle formed by the frame of the projection screen, together with the edge information of the shot image.

It should be noted that when the relative position between the projection device and the projection screen shifts, the angle at which the shooting device shoots the projection screen also changes, and the perspective transformation parameters change along with that shooting angle. The controller can therefore perform a preliminary correction on the feature points of the shot image according to the perspective transformation parameters, compensating for the changed shooting angle.
Further, the controller may acquire position information of the feature point of the captured image. And the position information of the characteristic points is pixel coordinate data of the characteristic points on the test chart on the shot image.
Therefore, the controller can perform the preliminary correction on the feature points of the shot image according to the acquired perspective transformation parameters.
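The perspective transformation described above can be sketched as follows. This is an illustration under stated assumptions (plain-Python linear solve, frame vertices already ordered, shot frame mapped onto an ideal rectangle); in practice a library routine such as OpenCV's getPerspectiveTransform would typically be used instead.

```python
def perspective_params(src, dst):
    """Solve the 8 coefficients (a..h) of the perspective transform mapping
    the four shot frame vertices `src` onto the ideal rectangle `dst`:
        u = (a*x + b*y + c) / (g*x + h*y + 1)
        v = (d*x + e*y + f) / (g*x + h*y + 1)
    Each point pair contributes two linear equations."""
    A, rhs = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); rhs.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); rhs.append(v)
    return gauss_solve(A, rhs)

def gauss_solve(A, b):
    """Gaussian elimination with partial pivoting on the 8x8 system."""
    n = len(A)
    M = [A[i][:] + [b[i]] for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def apply_perspective(params, point):
    """Apply the solved perspective transform to one pixel coordinate."""
    a, b, c, d, e, f, g, h = params
    x, y = point
    w = g * x + h * y + 1
    return ((a * x + b * y + c) / w, (d * x + e * y + f) / w)
```

Applying `apply_perspective` with the solved parameters to each detected feature point performs the kind of preliminary correction described in step S304.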
S305, the controller determines the correction parameters of the image to be projected according to the position information of the feature points before correction and the position information of the feature points after correction.

Specifically, the controller may determine the corrected position information of the feature points, that is, the pixel coordinate data on the shot image of the corrected feature points of the test card.
Further, the controller may compare the pixel coordinate data on the shot image of the feature points of the test card before correction with the pixel coordinate data on the shot image of the feature points after correction, and calculate the correction parameters of the image to be projected.
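One simple illustrative form such correction parameters could take is the per-feature-point displacement between the two sets of pixel coordinates; the representation is an assumption, since the embodiment does not fix the parameter format.

```python
def correction_offsets(before, after):
    """Displacement of each feature point from its pre-correction pixel
    position to its post-correction pixel position. The list of offset
    vectors is one simple representation of the correction parameters
    applied to the image to be projected."""
    return [(cx - px, cy - py)
            for (px, py), (cx, cy) in zip(before, after)]

offsets = correction_offsets([(0, 0), (10, 10)], [(1, 2), (12, 13)])
```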
S306, the controller corrects the image to be projected according to the correction parameters.
In some embodiments, the projection device is provided with a first camera and a second camera. Thus, the controller may perform the above steps S303 to S305 on the first image obtained by the first camera shooting the projection screen, to determine a first correction parameter; and synchronously perform steps S303 to S305 on the second image obtained by the second camera shooting the projection screen, to determine a second correction parameter.
Further, step S306 may be specifically implemented as: the controller corrects the image to be projected according to the first correction parameter and the second correction parameter.
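How the first and second correction parameters are combined is not spelled out here; as one hypothetical combination rule, the matching per-point offsets from the two cameras could simply be averaged:

```python
def combine_corrections(first, second):
    """Merge the per-point correction offsets computed from the first and
    second shot images by averaging the matching offset vectors. Averaging
    is an illustrative assumption, not a rule stated by the embodiment."""
    return [((x1 + x2) / 2, (y1 + y2) / 2)
            for (x1, y1), (x2, y2) in zip(first, second)]

merged = combine_corrections([(0, 0), (4, 2)], [(2, 4), (0, 0)])
```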
It should be understood that the controller may control and drive the projection assembly to project the corrected image to be projected onto the projection screen, so that the projected image displayed on the projection screen is rectangular, providing a good viewing experience for the user. It should also be understood that fig. 7 illustrates only one method for correcting the projected image; in practical applications, the controller may adopt other correction methods, which is not limited in this application.
Fig. 9 is a schematic diagram of an application scenario in which the projection system 10 in the embodiment of the present application interacts with the control device 400 and the server 500.
The control device 400 may be a remote controller 400A, which can communicate with the projection apparatus 100 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-range communication, and is used to control the projection apparatus 100 wirelessly or through other wired means. The user may input user commands through keys, voice input, a control panel, and the like on the remote controller 400A to control the projection apparatus 100. For example, the user may input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power on/off key, and so on, on the remote controller 400A to implement the functions of the projection apparatus 100.
The control device 400 may also be an intelligent device, such as a mobile terminal 400B, a tablet computer, or a notebook computer, which may communicate with the projection apparatus 100 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), or another network, and control the projection apparatus 100 through a corresponding application program. For example, the projection apparatus 100 can be controlled using an application running on the smart device. The application may provide various controls to the user through an intuitive user interface (UI) on a screen associated with the smart device.
For example, the mobile terminal 400B and the projection device 100 may each have a software application installed, so that connection and communication between the two can be realized through a network communication protocol, achieving one-to-one control operation and data communication. For instance, a control instruction protocol can be established between the mobile terminal 400B and the projection device 100, and a remote-control keyboard can be synchronized to the mobile terminal 400B, so that the functions of the projection device 100 are controlled through the user interface on the mobile terminal 400B; the audio and video content displayed on the mobile terminal 400B may also be transmitted to the projection device 100 to implement a synchronous display function.
The server 500 may be a video server, an Electronic Program Guide (EPG) server, a cloud server, or the like.
Projection device 100 may be in data communication with server 500 via a variety of communication means. In various embodiments of the present application, the projection device 100 may be in wired or wireless communication connection with the server 500 via a local area network, a wireless local area network, or another network. Server 500 may provide various content and interactions to projection device 100.
Illustratively, projection device 100 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and interacting with the EPG. The server 500 may be one group or multiple groups of servers, and may be of one or more types. The server 500 provides other network service contents such as video on demand and advertisement services.
Fig. 10 is a block diagram schematically showing the configuration of the control device 400 according to the exemplary embodiment. As shown in fig. 10, the control device 400 includes a controller 410, a communicator 430, a user input/output interface 440, a memory 490, and a power supply 480.
The control device 400 is configured to control the projection apparatus 100: it receives the user's input operation instructions and converts them into instructions that the projection apparatus 100 can recognize and respond to, acting as an intermediary between the user and the projection apparatus 100. For example, when the user operates the channel up/down keys on the control device 400, the projection apparatus 100 responds to the channel up/down operation.
In some embodiments, the control apparatus 400 may be a smart device. For example, various applications for controlling the projection apparatus 100 may be installed on the control apparatus 400 according to user requirements.
In some embodiments, the mobile terminal 400B or another intelligent electronic device may perform a function similar to that of the control apparatus 400 after an application for operating the projection device 100 is installed. For example, by installing applications, the user may implement the functions of the physical keys of the control apparatus 400 through various function keys or virtual buttons of a graphical user interface available on the mobile terminal 400B or other intelligent electronic devices.
The controller 410 includes a processor 412, a RAM 413, a ROM 414, a communication interface, and a communication bus. The controller 410 is used to control the operation of the control device 400, the communication and coordination among its internal components, and the external and internal data processing functions.
The communicator 430 enables communication of control signals and data signals with the projection apparatus 100 under the control of the controller 410. Such as: the received user input signal is transmitted to the projection device 100. The communicator 430 may include at least one of a WIFI module 431, a bluetooth module 432, a Near Field Communication (NFC) module 433, and the like.
A user input/output interface 440, wherein the input interface includes at least one of a microphone 441, a touch pad 442, a sensor 443, keys 444, a camera 445, and the like. Such as: the user may implement a user instruction input function through voice, touch, gesture, pressing, and the like, and the input interface converts the received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal, and sends the instruction signal to the projection apparatus 100.
The output interface includes an interface that transmits the received user instruction to the projection device 100. In some embodiments, it may be an infrared interface or a radio frequency interface. For example, when the infrared signal interface is used, the user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and sent to the projection device 100 through the infrared transmitting module. As another example, when the radio frequency signal interface is used, the user input instruction needs to be converted into a digital signal, modulated according to a radio frequency control signal modulation protocol, and then transmitted to the projection device 100 through the radio frequency transmitting terminal.
In some embodiments, the control device 400 includes at least one of a communicator 430 and an output interface. The control device 400 is configured with a communicator 430 such as a WiFi, Bluetooth, or NFC module, which may encode the user input command according to a WiFi protocol, Bluetooth protocol, or NFC protocol and transmit it to the projection device 100.
A memory 490 stores various operating programs, data, and applications for driving and controlling the control device 400 under the control of the controller 410. The memory 490 may store various control signal commands input by the user.

A power supply 480 provides operating power support for the electrical components of the control device 400 under the control of the controller 410. The power supply 480 may be implemented using a battery and associated control circuitry.
In addition, as shown in FIG. 11, the application layer of the projection device may include various applications that may be executed on projection device 100.
The live television application program can provide live television through different signal sources. For example, a live television application may provide television signals using input from cable television, radio broadcasts, satellite services, or other types of live television services. And, the live television application may display video of the live television signal on projection device 100.
A video-on-demand application may provide video from different storage sources. Unlike live television applications, video on demand provides a video display from a storage source. For example, the video on demand may come from the server side of a cloud storage, or from a local hard disk storage containing stored video programs.
The media center application program can provide various applications for playing multimedia content. For example, a media center may provide services other than live television or video on demand, through which a user can access various images or audio via the media center application.
The application program center can provide and store various application programs. An application may be a game, an application program, or some other application that is associated with a computer system or other device and can run on the display device. The application center may obtain these applications from different sources and store them in local storage, after which they can run on the projection device 100.
The embodiment of the present application further provides a computer-readable storage medium comprising computer-executable instructions which, when run on a computer, cause the computer to perform any one of the methods provided by the above embodiments.

The embodiments of the present application also provide a computer program product containing instructions which, when run on a computer, cause the computer to perform any one of the methods provided by the above embodiments.

An embodiment of the present application further provides a chip, including a processor and an interface, the processor being coupled to a memory through the interface; when the processor executes the computer program or the computer-executable instructions in the memory, any one of the methods provided by the above embodiments is performed.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When a software program is used, the implementation may take the form, in whole or in part, of a computer program product. The computer program product includes one or more computer-executable instructions. When the computer-executable instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer-executable instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a server or data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
While the present application has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A projection device, characterized in that the projection device comprises:
the projection assembly is used for projecting an image to be projected onto a projection screen;
a first ranging sensor for detecting a first distance between the first ranging sensor and the projection screen;
a second ranging sensor for detecting a second distance between the second ranging sensor and the projection screen;
a controller connected to the projection assembly, the first ranging sensor, and the second ranging sensor, respectively, the controller configured to:
acquiring the first distance and the second distance;
when the first distance is different from a first initial distance and/or the second distance is different from a second initial distance, correcting the image to be projected; wherein the first initial distance is a distance between the projection screen and the first ranging sensor when the projection device is in an initial posture, and the second initial distance is a distance between the projection screen and the second ranging sensor when the projection device is in the initial posture.
2. The projection device of claim 1, wherein a line between the first ranging sensor and the second ranging sensor is parallel to a horizontal line when the projection device is placed on a horizontal surface.
3. The projection device of claim 2, wherein the first distance measuring sensor and the second distance measuring sensor are both disposed on a side of the projection device near the light outlet.
4. The projection device of any of claims 1 to 3, wherein the projection device further comprises: a motion sensor connected to the controller;
the motion sensor is used for detecting the acceleration of the projection equipment;
the controller is further configured to:
acquiring the acceleration of the projection equipment;
and when the acceleration of the projection equipment is different from the initial acceleration, correcting the image to be projected, wherein the initial acceleration is the acceleration detected by the motion sensor when the projection equipment is in the initial posture.
5. The projection device of claim 4, wherein the motion sensor is a three-axis motion sensor or a six-axis motion sensor.
6. A projection device, characterized in that the projection device comprises:
the projection assembly is used for projecting an image to be projected onto a projection screen;
a motion sensor for detecting an acceleration of the projection device;
a controller connected to the motion sensor and the projection assembly, respectively, the controller configured to:
acquiring the acceleration of the projection equipment;
and when the acceleration of the projection equipment is different from the initial acceleration, correcting the image to be projected, wherein the initial acceleration is the acceleration detected by the motion sensor when the projection equipment is in the initial posture.
7. A projection system, comprising:
a projection screen and a projection device according to any one of claims 1 to 6;
wherein the projection device is configured to project a projection image on the projection screen.
8. A projection method applied to a projection apparatus configured with a first ranging sensor and a second ranging sensor, the method comprising:
acquiring a first distance between a first ranging sensor and a projection screen and a second distance between a second ranging sensor and the projection screen;
when the first distance is different from a first initial distance and/or the second distance is different from a second initial distance, correcting the image to be projected; wherein the first initial distance is a distance between the projection screen and the first ranging sensor when the projection device is in an initial posture, and the second initial distance is a distance between the projection screen and the second ranging sensor when the projection device is in the initial posture.
9. The method of claim 8, wherein the projection device further comprises a motion sensor, the method further comprising:
acquiring the acceleration of the projection equipment detected by the motion sensor;
and when the acceleration of the projection equipment is different from the initial acceleration, correcting the image to be projected, wherein the initial acceleration is the acceleration detected by the motion sensor when the projection equipment is in the initial posture.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises computer program instructions which, when read and executed by a computer, cause the computer to perform the method of claim 8 or 9.
CN202111250949.3A 2021-10-26 2021-10-26 Projection equipment and control method thereof Pending CN113965740A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111250949.3A CN113965740A (en) 2021-10-26 2021-10-26 Projection equipment and control method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111250949.3A CN113965740A (en) 2021-10-26 2021-10-26 Projection equipment and control method thereof

Publications (1)

Publication Number Publication Date
CN113965740A true CN113965740A (en) 2022-01-21

Family

ID=79467259

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111250949.3A Pending CN113965740A (en) 2021-10-26 2021-10-26 Projection equipment and control method thereof

Country Status (1)

Country Link
CN (1) CN113965740A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114740953A (en) * 2022-04-08 2022-07-12 中国长城科技集团股份有限公司 Notebook computer


Similar Documents

Publication Publication Date Title
EP3191888B1 (en) Scanning laser planarity detection
CN102540673B (en) Laser point position determination system and method
JP2017504047A (en) Scanning laser proximity detection
US20140132498A1 (en) Remote control using depth camera
US20220239876A1 (en) Information processing device, information processing method, program, projection device, and information processing system
US20180357036A1 (en) Display system, display device, and method of controlling display system
CN113973195A (en) Projection equipment and correction method thereof
US20210397296A1 (en) Information processing device, information processing method, and program
CN113052884A (en) Information processing method, information processing apparatus, storage medium, and electronic device
CN113965740A (en) Projection equipment and control method thereof
CN109542218B (en) Mobile terminal, human-computer interaction system and method
CN114760454A (en) Projection equipment and trigger correction method
US10447996B2 (en) Information processing device and position information acquisition method
CN111178306B (en) A display control method and electronic device
CN103593050A (en) Method and system for selecting news screen and transmitting picture through mobile terminal
CN103295387B (en) A kind of electronic equipment, projection remote controller and its implementation
CN105677030B (en) A kind of control method and electronic equipment
CN114549658A (en) Camera calibration method and device and electronic equipment
JP2002243446A (en) Position data setting apparatus and environmental data obtaining apparatus
KR20180031238A (en) Mobile terminal and method for controlling the same
KR20110032224A (en) System and method for providing user interface by gesture and gesture signal generator and terminal for same
JP2017183776A (en) Display device, and control method of display device
JP2017125764A (en) Object detection device and image display device provided with object detection device
CN116055774B (en) Display device and control method thereof
CN116069433B (en) Image drifting method and system based on virtual desktop infrastructure and action recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination