CN216356893U - Electronic equipment - Google Patents
Electronic equipment
- Publication number
- CN216356893U CN216356893U CN202121776632.9U CN202121776632U CN216356893U CN 216356893 U CN216356893 U CN 216356893U CN 202121776632 U CN202121776632 U CN 202121776632U CN 216356893 U CN216356893 U CN 216356893U
- Authority
- CN
- China
- Prior art keywords
- range
- electronic device
- radar
- target object
- relative
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Radar Systems Or Details Thereof (AREA)
Abstract
An electronic device includes a body; a radar component arranged on the body, the radar component corresponding to a sensing area that covers a first range and being used for collecting and processing radar data within the first range; a camera assembly arranged on the body, the camera assembly corresponding to an acquisition area that covers a second range, the first range covering the second range, the camera assembly being used for collecting and processing acquisition data within the second range; and a processor connected, directly or indirectly, to the radar component and the camera assembly respectively, the processor being used for calibrating the target object in the acquisition data based on the processed radar data. Because the radar component collects and processes radar data and the processor calibrates the target object based on the processed radar data, continuous focusing on the target object is ensured, blurring of the target object caused by discontinuous focusing is avoided, and the video quality is improved.
Description
Technical Field
The present application relates to the technical field of video acquisition, and in particular to an electronic device.
Background
When a video is shot, the shot target (such as a person or an animal) is often in motion, so the focus established at the start of shooting easily fails to track the target in time. Because focusing is not maintained continuously throughout shooting, the subject in the video becomes blurred and the quality of the shot video is low.
For example, when there is a blocking object (a wall, etc.) between the camera and the object to be photographed, the camera loses focus while the object is blocked, and cannot identify and refocus on the object in time even after the object emerges from behind the blocking object. At night in particular, when the light is poor, the target cannot be focused accurately and continuously.
SUMMARY OF THE UTILITY MODEL
In view of this, an embodiment of the present application provides an electronic device to solve the following problem in the prior art: the focus established at the start of video shooting easily fails to track the moving target in time, and without continuous focusing throughout shooting the subject in the video is blurred, so the quality of the shot video is low.
In one aspect, an embodiment of the present application provides an electronic device, which includes:
a body;
a radar component arranged on the body, the radar component corresponding to a sensing area, the sensing area covering a first range, and the radar component being used for collecting and processing radar data collected by the radar component;
a camera assembly arranged on the body, the camera assembly corresponding to an acquisition area, the acquisition area covering a second range, the first range covering the second range, and the camera assembly being used for collecting and processing acquisition data collected by the camera assembly; and
a processor connected, directly or indirectly, to the radar component and the camera assembly respectively, and used for calibrating the target object in the acquisition data based on the processed radar data.
In some embodiments, the radar assembly comprises:
a transmitter connected to a transmitting antenna and used for providing the modulated millimeter-wave signal to be transmitted;
a receiver connected to a receiving antenna and used for receiving the echo signal; and
a microprocessor used for obtaining relative parameters of the object in the first range with respect to the electronic device based on the modulated millimeter-wave signal and the echo signal, the relative parameters being taken as the radar parameters.
In some embodiments, the relative parameters include at least one of: the relative distance, relative velocity, heartbeat frequency, and/or respiratory frequency of the object in the first range with respect to the electronic device.
In some embodiments, the processor comprises:
a first processing module used for calibrating the object corresponding to the first range in the acquisition data based on the relative parameters.
In some embodiments, the processor further comprises:
a second processing module used for determining a living body object based on the relative parameters of the objects corresponding to the first range in the acquisition data, and for controlling the acquisition data to be displayed through a display screen with a living body mark displayed.
In some embodiments, in the case that a plurality of living objects are included in the first range, the second processing module is specifically configured to:
determining the living body object whose heartbeat and/or respiratory frequency is the same as a preset heartbeat and/or respiratory frequency to be the target object, and setting the living body mark of the target object to be different from the living body marks of the other living body objects.
In some embodiments, the processor further comprises:
a third processing module used for predicting an in-focus position based on the relative parameters of the objects corresponding to the first range in the acquisition data, and for controlling the camera assembly based on the in-focus position.
In some embodiments, the third processing module is specifically configured to:
determining the living body object whose heartbeat and/or respiratory frequency is the same as the preset heartbeat and/or respiratory frequency to be the target object, and predicting the in-focus position based on the relative speed of the target object.
In some embodiments, the third processing module is further configured to:
in the case where there are multiple target objects, predicting the in-focus position based on the target object whose relative speed is the same as a preset speed.
In some embodiments, the third processing module is further configured to:
predicting an in-focus position based on the first range and the relative speed.
The technical effects of the embodiments of the present application are as follows: the radar component collects and processes radar data, and the processor calibrates the target object based on the processed radar data, thereby ensuring continuous focusing on the target object, avoiding blurring of the target object caused by discontinuous focusing, and improving the video quality.
Drawings
In order to illustrate the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present disclosure; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of an electronic device according to a first embodiment of the present application.
Reference numerals:
1-body; 2-radar component; 3-camera assembly; 4-processor; 21-transmitter; 22-receiver; 23-microprocessor; 41-first processing module; 42-second processing module; 43-third processing module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings of the embodiments of the present application. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the application without any inventive step, are within the scope of protection of the application.
Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs. As used in this application, the terms "first," "second," and the like do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element or item listed before the word covers the element or item listed after the word and its equivalents, but does not exclude other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships may also be changed accordingly.
Detailed descriptions of known functions and known components are omitted in the present application in order to keep the following description of the embodiments of the present application clear and concise.
The embodiment of the present application provides an electronic device which can shoot videos while keeping the target object in focus in real time during shooting; even when the target object is shielded by a blocking object, focusing on the target object can still be achieved to a certain extent, thereby improving the video quality.
Specifically, as shown in fig. 1, the electronic device provided in the embodiment of the present application includes a main body 1, and the main body 1 serves as a support or a carrier to carry various components of the electronic device, so as to ensure that the circuits of the electronic device can be normally connected.
Further, the electronic device comprises a radar component 2 arranged on the body 1. The radar component 2 corresponds to a sensing area, i.e., the area in which the radar component 2 can acquire radar data. The sensing area covers a first range, and the radar component 2 is used for collecting and processing the radar data it collects within the first range.
Here, the first range may be a range determined by an obstacle, or it may be the range corresponding to the entire sensing area; in either case, even if the object is blocked by a blocking object, the radar data corresponding to the object can still be collected through the radar component 2.
Specifically, the radar component 2 includes a transmitter 21, a receiver 22, and a microprocessor 23. The transmitter 21 is connected to a transmitting antenna; in a specific implementation, the transmitter 21 modulates a millimeter-wave signal according to a preset rule and transmits it into the sensing area through the connected transmitting antenna. The receiving antenna then acquires, in real time, the echo signal returned in response to the millimeter-wave signal; the receiver 22 is connected to the receiving antenna, and the receiving antenna passes the acquired echo signal to the receiver 22. The acquisition of radar data is thus accomplished by the transmitter 21, the transmitting antenna, the receiver 22, and the receiving antenna.
It is worth mentioning that speed measurement by the radar component 2 is based on the Doppler effect arising from relative motion between the radar and the target. The frequency of the echo signal received by the radar component 2 differs from the frequency of the transmitted millimeter-wave signal; the difference between the two is called the Doppler frequency. One of the main pieces of information extractable from the Doppler frequency is the rate of change of the distance between the radar component 2 and the object. Even when the object and interference clutter occupy the same spatial resolution cell of the radar component 2, the radar component 2 can use the difference in Doppler frequency to detect and track the object within the clutter. Therefore, the embodiment of the present application preferably employs the radar component 2 to acquire radar parameters for the purpose of tracking the object and focusing on it continuously.
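The Doppler relationship described above can be sketched numerically. The 77 GHz carrier below is an assumption (a common millimeter-wave radar band); the patent does not specify the carrier frequency.

```python
def doppler_velocity(f_doppler_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative radial velocity implied by a measured Doppler shift.

    For a monostatic radar the round trip doubles the shift:
    f_d = 2 * v / wavelength, hence v = f_d * wavelength / 2.
    carrier_hz = 77e9 is an assumed mmWave carrier, not from the patent.
    """
    c = 299_792_458.0              # speed of light, m/s
    wavelength = c / carrier_hz    # ~3.9 mm at 77 GHz
    return f_doppler_hz * wavelength / 2.0
```

A 1 kHz Doppler shift at 77 GHz thus corresponds to a closing speed of roughly 1.95 m/s, and the sign of the shift distinguishes approaching from receding targets.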
While transmitting the millimeter-wave signal into the sensing area through the connected transmitting antenna, the transmitter 21 also passes the millimeter-wave signal to the microprocessor 23; likewise, the receiver 22 passes the echo signal to the microprocessor 23 after receiving it. It should be noted that millimeter-wave signals may be transmitted continuously, so each received echo signal needs to be put into correspondence with the millimeter-wave signal it answers in order to ensure the accuracy of the radar data; the correspondence may be established based on transmission time, transmission order, preset tags, and the like.
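The correspondence bookkeeping can be illustrated with timestamped tags, one of the schemes the text mentions. This is a minimal sketch; the tag format and the 10 µs matching window are assumptions, not details from the patent.

```python
def pair_echoes(transmissions, echoes, max_delay_s=1e-5):
    """Pair each echo with the transmitted signal it most plausibly answers.

    transmissions / echoes: lists of (tag, timestamp_s) tuples.
    Each echo is matched to the latest transmission that precedes it by
    no more than max_delay_s (10 us covers round trips out to ~1.5 km).
    """
    pairs = []
    for e_tag, e_t in echoes:
        candidates = [(tx_tag, tx_t) for tx_tag, tx_t in transmissions
                      if 0.0 <= e_t - tx_t <= max_delay_s]
        if candidates:
            tx_tag, _ = max(candidates, key=lambda c: c[1])  # latest plausible transmission
            pairs.append((tx_tag, e_tag))
    return pairs
```

For example, with two transmissions 20 µs apart, an echo arriving 5 µs after each transmission is paired with that transmission and not with the earlier one.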
After receiving a millimeter-wave signal and the echo signal corresponding to it, the microprocessor 23 obtains the relative parameters of the object in the first range with respect to the electronic device based on the modulated millimeter-wave signal and the echo signal. In a specific implementation, the microprocessor 23 may obtain the relative parameters by performing an analog-to-digital conversion operation, and may of course also perform signal denoising and other processing, as long as reasonably accurate relative parameters are obtained. The relative parameters include at least one of: the relative distance, relative velocity, heartbeat frequency, and/or respiratory frequency of the object in the first range with respect to the electronic device. Here, the relative parameters are taken as the radar parameters.
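How the digitized signal yields a relative distance depends on the modulation scheme, which the patent does not disclose. As one plausible sketch, assuming FMCW (chirped) modulation, the relative distance follows from the beat frequency of the digitized mixer output; all parameter values below are assumptions.

```python
import numpy as np

def estimate_range(beat_signal, fs_hz, chirp_slope_hz_per_s):
    """Estimate relative distance from one FMCW chirp's digitized beat signal.

    The beat frequency f_b relates to range R as R = f_b * c / (2 * slope).
    FMCW modulation is assumed; the patent only says the millimeter-wave
    signal is modulated according to a preset rule.
    """
    c = 299_792_458.0
    n = len(beat_signal)
    spectrum = np.abs(np.fft.rfft(beat_signal))
    peak_bin = int(np.argmax(spectrum[1:])) + 1    # skip the DC bin
    f_beat = peak_bin * fs_hz / n                  # bin index -> Hz
    return f_beat * c / (2.0 * chirp_slope_hz_per_s)

# Synthetic check: a target at 5 m, 30 MHz/us chirp slope, 10 MHz sampling.
slope = 30e12
fs = 10e6
f_b = 2 * 5.0 * slope / 299_792_458.0       # expected beat frequency, ~1 MHz
t = np.arange(1024) / fs
beat = np.cos(2 * np.pi * f_b * t)
```

In FMCW systems the relative velocity is then typically obtained from the phase progression of the same spectral peak across successive chirps, and heartbeat/respiratory frequencies from the small periodic phase oscillations of a subject's range bin.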
The electronic device further comprises a camera assembly 3, which is also arranged on the body 1. The camera assembly 3 corresponds to an acquisition area, the acquisition area covers a second range, and the camera assembly 3 is used for collecting and processing the acquisition data it collects within the second range; as with the radar data, processing the acquisition data involves analog-to-digital conversion, denoising, and the like, which are not described in detail here. Further, in the embodiment of the present application the first range covers the second range, ensuring that the radar component 2 can acquire radar data over the whole of the second range so that subsequent focusing on the target object can be achieved.
The electronic device provided in the embodiment of the present application further comprises a processor 4. The processor 4 is connected, directly or indirectly, to the radar component 2 so that the radar data collected and processed by the radar component 2 can be acquired in real time; the processor 4 is likewise connected, directly or indirectly, to the camera assembly 3 so as to acquire in real time the acquisition data collected and processed by the camera assembly 3. After acquiring the radar data and the acquisition data, the processor 4 calibrates the target object in the acquisition data based on the processed radar data.
Specifically, the processor 4 includes a first processing module 41. After the radar data and the acquisition data are acquired, the first processing module 41 calibrates the objects corresponding to the first range in the acquisition data based on the relative parameters, i.e., the radar data. Specifically, the objects in the second range may be determined according to a preset calibration rule; for example, the image corresponding to the first range is determined from the acquisition data, and the image is identified and analyzed to calibrate the objects in the first range. The distance between an object and the camera assembly 3 is then determined from the acquisition data, and the position of the object in the first range is determined and calibrated based on this distance and the relative distance in the relative parameters. The calibration may of course also be performed based on the relative speed in the relative parameters, or on both the relative distance and the relative speed; this is not specifically limited in the embodiment of the present application. The object may be a person, an animal, a vehicle, or the like.
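The association step can be illustrated with a toy matcher. The nearest-distance rule and the 0.5 m tolerance below are assumptions; the patent leaves the preset calibration rule open.

```python
def calibrate_targets(image_objects, radar_detections, tol_m=0.5):
    """Associate image-detected objects with radar detections by distance.

    image_objects:    [(label, camera_estimated_distance_m), ...]
    radar_detections: [(relative_distance_m, relative_speed_mps), ...]
    Each labelled object is paired with the radar detection whose relative
    distance is closest to the camera's estimate, if within tol_m.
    """
    matches = []
    for label, cam_dist in image_objects:
        best = min(radar_detections,
                   key=lambda det: abs(det[0] - cam_dist),
                   default=None)
        if best is not None and abs(best[0] - cam_dist) <= tol_m:
            matches.append((label, best))
    return matches
```

For example, an image object labelled "person" at an estimated 3.1 m is calibrated against a radar detection at 3.0 m, while a detection far from any image estimate stays unmatched.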
Further, the processor 4 includes a second processing module 42. After calibration of the objects in the first range is completed, a living body object is determined based on the relative parameters of the objects corresponding to the first range in the acquisition data. Specifically, for each calibrated object it is checked whether its relative parameters include a heartbeat and/or respiratory frequency, or whether its heartbeat and/or respiratory frequency is nonzero; if so, the object is determined to be a living body object. After the living body object is determined, the acquisition data is displayed through the display screen under control, with a living body mark displayed on the living body object, for example a heart shape added over it.
In a specific implementation, only one object may need to be tracked within the first range. In this case, the second processing module 42 acquires a preset heartbeat and/or respiratory frequency pre-stored in the electronic device, compares it with the heartbeat and/or respiratory frequency of each living body object, determines the living body object whose heartbeat and/or respiratory frequency matches the preset values to be the target object, and sets the living body mark of the target object to be different from the living body marks of the other living body objects, for example marking the other living body objects with a heart shape and the target object with a red heart shape.
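The living-body filtering and target selection of the second processing module 42 can be sketched as follows. The exact-match comparison is relaxed to a small tolerance here, which is an assumption; the text compares the frequencies for equality.

```python
def select_target(objects, preset_heart_bpm, preset_breath_rpm, tol=2.0):
    """Mark living objects and pick the target by vital-sign match.

    objects: [(object_id, heart_bpm, breath_rpm), ...]; non-living objects
    carry 0 for both rates. Returns (target_id_or_None, living_ids) so the
    caller can draw a living-body mark on each living object and a
    distinctive mark on the target.
    """
    living = [o for o in objects if o[1] > 0 or o[2] > 0]
    living_ids = [o[0] for o in living]
    for obj_id, heart, breath in living:
        if (abs(heart - preset_heart_bpm) <= tol
                and abs(breath - preset_breath_rpm) <= tol):
            return obj_id, living_ids
    return None, living_ids
```

With three calibrated objects, one inanimate and two living, the living ones each receive a mark and the one matching the preset 72 bpm / 16 rpm becomes the target.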
The processor 4 further comprises a third processing module 43, which predicts an in-focus position based on the relative parameters of the objects corresponding to the first range in the acquisition data and controls the camera assembly 3 based on the in-focus position. Specifically, the third processing module 43 determines the living body object whose heartbeat and/or respiratory frequency is the same as the preset heartbeat and/or respiratory frequency to be the target object, and then predicts the in-focus position based on the relative speed of the target object.
In a specific implementation, there may also be multiple target objects whose heartbeat and/or respiratory frequency matches the preset values. In this case, the final target object may be further determined by the relative speed of each target object: the pre-stored preset speed is compared with the relative speed of each target object, the target object whose relative speed is the same as the preset speed is determined to be the final target object, and the in-focus position is predicted based on that target object.
Specifically, the movement trajectory of the finally determined target object within the first range is calculated, the in-focus position at each moment is determined based on the movement trajectory and the relative speed of the target object, and the camera assembly 3 is controlled to focus and shoot at the determined in-focus position.
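The prediction of the third processing module 43 amounts to extrapolating the radar track by the relative speed. A minimal linear sketch, assuming a fixed focus-actuation delay (the lookahead value is not from the patent):

```python
def predict_focus_distance(track, relative_speed_mps, lookahead_s=0.1):
    """Predict the distance to focus at, a short time ahead.

    track: [(t_s, relative_distance_m), ...] recent radar samples of the
    target, most recent last. Extrapolates the last measured distance by
    the radar-measured relative speed over the focus actuation delay.
    """
    _, last_distance = track[-1]
    return last_distance + relative_speed_mps * lookahead_s
```

A target last seen at 4.9 m and approaching at 1 m/s would be focused at about 4.8 m; richer models (e.g. fitting the whole trajectory) follow the same idea.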
The electronic device of the embodiment of the present application uses the radar component 2 to collect and process radar data, and calibrates the target object through the processor 4 based on the processed radar data, thereby ensuring continuous focusing on the target object, avoiding blurring of the target object caused by discontinuous focusing, and improving the video quality.
Moreover, although exemplary embodiments have been described herein, the scope thereof includes any and all embodiments based on the present application with equivalent elements, modifications, omissions, combinations (e.g., of various embodiments across), adaptations or alterations. The elements of the claims are to be interpreted broadly based on the language employed in the claims and not limited to examples described in the present specification or during the prosecution of the application, which examples are to be construed as non-exclusive. It is intended, therefore, that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.
The above description is intended to be illustrative and not restrictive. For example, the above-described examples (or one or more versions thereof) may be used in combination with each other. For example, other embodiments may be used by those of ordinary skill in the art upon reading the above description. In addition, in the above detailed description, various features may be grouped together to streamline the application. This should not be interpreted as an intention that a disclosed feature not claimed is essential to any claim. Rather, subject matter of the present application can lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with each other in various combinations or permutations. The scope of the application should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
The embodiments of the present application have been described in detail, but the present application is not limited to these specific embodiments; those skilled in the art can make various modifications and variants based on the concept of the present application, and such modifications and variants shall fall within the scope of the present application.
Claims (10)
1. An electronic device, characterized in that the electronic device comprises:
a body;
a radar component arranged on the body, the radar component corresponding to a sensing area, the sensing area covering a first range, and the radar component being used for collecting and processing radar data collected by the radar component;
a camera assembly arranged on the body, the camera assembly corresponding to an acquisition area, the acquisition area covering a second range, the first range covering the second range, and the camera assembly being used for collecting and processing acquisition data collected by the camera assembly; and
a processor connected, directly or indirectly, to the radar component and the camera assembly respectively, and used for calibrating the target object in the acquisition data based on the processed radar data.
2. The electronic device of claim 1, wherein the radar component comprises:
a transmitter connected to a transmitting antenna and used for providing the modulated millimeter-wave signal to be transmitted;
a receiver connected to a receiving antenna and used for receiving the echo signal; and
a microprocessor used for obtaining relative parameters of the object in the first range with respect to the electronic device based on the modulated millimeter-wave signal and the echo signal, the relative parameters being taken as the radar parameters.
3. The electronic device of claim 2, wherein the relative parameters include at least one of: the relative distance, relative velocity, heartbeat frequency, and/or respiratory frequency of the object in the first range with respect to the electronic device.
4. The electronic device of claim 3, wherein the processor comprises:
a first processing module used for calibrating the object corresponding to the first range in the acquisition data based on the relative parameters.
5. The electronic device of claim 3, wherein the processor further comprises:
a second processing module used for determining a living body object based on the relative parameters of the objects corresponding to the first range in the acquisition data, and for controlling the acquisition data to be displayed through a display screen with a living body mark displayed.
6. The electronic device of claim 5, wherein, in the case that a plurality of living objects are included in the first range, the second processing module is specifically configured to:
determining the living body object whose heartbeat and/or respiratory frequency is the same as a preset heartbeat and/or respiratory frequency to be the target object, and setting the living body mark of the target object to be different from the living body marks of the other living body objects.
7. The electronic device of claim 3, wherein the processor further comprises:
a third processing module used for predicting an in-focus position based on the relative parameters of the objects corresponding to the first range in the acquisition data, and for controlling the camera assembly based on the in-focus position.
8. The electronic device of claim 7, wherein the third processing module is specifically configured to:
determining the living body object whose heartbeat and/or respiratory frequency is the same as the preset heartbeat and/or respiratory frequency to be the target object, and predicting the in-focus position based on the relative speed of the target object.
9. The electronic device of claim 8, wherein the third processing module is further to:
in the case where there are multiple target objects, predicting the in-focus position based on the target object whose relative speed is the same as a preset speed.
10. The electronic device of claim 7 or 8, wherein the third processing module is further to:
predicting an in-focus position based on the first range and the relative speed.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202121776632.9U CN216356893U (en) | 2021-07-30 | 2021-07-30 | Electronic equipment |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN216356893U true CN216356893U (en) | 2022-04-19 |
Family
ID=81159450
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202121776632.9U Active CN216356893U (en) | 2021-07-30 | 2021-07-30 | Electronic equipment |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN216356893U (en) |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US6762709B2 (en) | Radar system, method of obtaining image, control method based on image data and monitoring method using milliwaves | |
| TWI767373B (en) | A method, a computer program product, an apparatus and a frequency-modulated continuous-wave radar system | |
| US8684934B2 (en) | Adaptively performing clutter filtering in an ultrasound system | |
| US10631820B2 (en) | Ultrasound diagnostic imaging apparatus and ultrasound image display method | |
| EP2972474A1 (en) | Multi-sensor surveillance system for monitoring a space and detection of objects | |
| KR102203928B1 (en) | Method for detecting position of micro robot using ultra wiide band impulse radar and therefore device | |
| CN112184749A (en) | Moving target tracking method based on video SAR cross-domain joint | |
| CN112419405B (en) | Target tracking joint display method, security system and electronic equipment | |
| US12241977B1 (en) | Passive human detection method and apparatus, device, and medium | |
| JP2008164545A (en) | Moving target detection device, moving target detection method, and moving target detection program | |
| IL265930A (en) | Detection apparatus and method | |
| CN114767074B (en) | Vital sign measuring method, equipment and storage medium | |
| JP2021001735A (en) | Sensor device and sensing method | |
| CN119355666A (en) | Target detection method and device, integrated circuit, electromagnetic wave sensor and terminal equipment | |
| WO2023083164A1 (en) | Target tracking method and apparatus, signal fusion method and apparatus, and terminal and storage medium | |
| CN109298417B (en) | Building internal structure detection method and device based on radar signal processing | |
| CN116500571A (en) | Method and device for detecting living body target, radar and storage medium | |
| Li et al. | Indoor multihuman device-free tracking system using multiradar cooperative sensing | |
| CN110554378A (en) | Single-channel Doppler radar radial motion direction identification method and device | |
| CN111316126A (en) | Target detection method, radar, vehicle, and computer-readable storage medium | |
| CN216356893U (en) | Electronic equipment | |
| CN119861369A (en) | Three-dimensional positioning life detection radar system | |
| Jia et al. | A novel approach to target localization through unknown walls for through-the-wall radar imaging | |
| CN220019916U (en) | Human body detection system | |
| US11147539B2 (en) | Methods and systems for blood speckle imaging |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| GR01 | Patent grant | ||