
CN113031265B - Split AR display device and display method

Info

Publication number
CN113031265B
Authority
CN
China
Prior art keywords
signal
module
display
environment
audio
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110163539.9A
Other languages
Chinese (zh)
Other versions
CN113031265A (en)
Inventor
翁志彬 (Weng Zhibin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Pimax Intelligent Technology Co.,Ltd.
Xiaopai Technology Quzhou Co ltd
Original Assignee
Hangzhou Xiaopai Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Xiaopai Intelligent Technology Co ltd
Priority to CN202110163539.9A
Publication of CN113031265A
Application granted
Publication of CN113031265B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application relates to a split-type AR display device and a display method. In the method, the computing processing module acquires environment information of the environment where the AR display device is located through the information acquisition module and processes it into an environment signal; the environment signal is transmitted to the operation processing device through the interface module, and the operation processing device processes the environment signal to obtain a first display signal; the DP-MIPI interface element processes the first display signal into a second display signal; the computing processing module then controls the logic operation of the optical waveguide driver to process the second display signal into an image signal, which is displayed on the display element.

Description

Split AR display device and display method
Technical Field
The application relates to the field of augmented reality, in particular to a split-type AR display device and a display method.
Background
With the rapid development of augmented reality (AR) technology, major consumer electronics manufacturers have released various types of AR glasses. As smart electronic devices intended to be worn in daily life, AR glasses must satisfy consumers' expectations for wearing comfort and appearance. However, the integrated AR glasses currently on the market suffer from a bulky form factor, heavy weight and poor wearing experience, while split-type glasses have thick and heavy display parts and display driving parts with complex hardware designs and high cost, so they are hardly accepted by mass consumers.
At present, no effective solution has been proposed in the related art for the problems of the heavy weight of AR glasses and the high cost of their display driving parts.
Disclosure of Invention
The embodiments of the present application provide a split-type AR display device and a display method, which at least solve the problems in the related art of the heavy weight of AR glasses and the high cost of their display driving parts.
In a first aspect, an embodiment of the present application provides a split-type AR display device, where the AR display device includes a display device and an operation processing device, the display device includes an information acquisition module, a calculation processing module, a display module and an interface module, and the display device is connected with the operation processing device through the interface module;
the computing processing module acquires environment information of the environment where the AR display equipment is located through the information acquisition module;
the computing processing module processes the environmental information to obtain an environmental signal;
the computing processing module transmits the environment signal to the operation processing device through the interface module;
the operation processing device processes the environmental signal to obtain a first display signal;
the operation processing device transmits the first display signal to the display module through the interface module and displays the first display signal.
In some embodiments, the information acquisition module includes an inertial sensing element, an optical sensing element, an audio sensing module, and a camera module, and the acquiring, by the information acquisition module, environmental information of an environment where the AR display device is located includes:
the inertial sensing element obtains motion information of the AR display device;
the optical sensing element acquires ambient light information of an environment in which the AR display device is located;
the audio sensing module acquires environmental audio information of the environment where the AR display equipment is located;
and the camera module acquires environment image information of the environment where the AR display equipment is located.
In some embodiments, the interface module includes a USB hub element and a Type-C interface element, and the computing processing module transmitting the environment signal to the operation processing device through the interface module includes: the computing processing module transmits the environment signal to the USB hub element, and the USB hub element processes the environment signal and transmits it to the operation processing device through the Type-C interface element.
In some embodiments, the interface module further includes a signal switching control module, where the signal switching control module includes a DP-MIPI interface element, and the operation processing device transmitting the first display signal to the display module through the interface module for display includes:
the operation processing device transmits the first display signal to the DP-MIPI interface element;
the computing processing module processes the first display signal through the DP-MIPI interface element to obtain a second display signal;
the computing processing module transmits the second display signal to the display module through the DP-MIPI interface element;
the computing processing module processes the second display signal into an image signal through the display module and displays the image signal.
In some embodiments, the audio sensing module includes an audio acquisition element, an audio playing element, and an audio control element, and the audio sensing module acquires environmental audio information of an environment where the AR display device is located, including:
the audio control element acquires environment audio information of the environment where the AR display device is located through the audio acquisition element;
the audio control element receives the audio signal of the interface module and plays the audio signal through the audio playing element.
In some embodiments, the camera module includes an RGB image capturing element, a first image capturing element, and a second image capturing element, and the acquiring, by the camera module, environment image information of an environment in which the AR display device is located includes:
the RGB camera element acquires video image information of the environment where the AR display device is located in real time;
the first image pickup element and the second image pickup element acquire spatial information of the environment where the AR display device is located in real time.
In some embodiments, the display module includes an optical waveguide driver and a display element, and the computing processing module processing the second display signal into an image signal and displaying it through the display module includes: the calculation processing module processes the second display signal into an image signal by controlling the logic operation of the optical waveguide driver, and displays the image signal on the display element.
In a second aspect, an embodiment of the present application provides a split-type AR display method, where the AR display method includes:
the computing processing module acquires environment information of the environment where the AR display equipment is located through the information acquisition module;
the computing processing module processes the environmental information to obtain an environmental signal;
the computing processing module transmits the environment signal to an operation processing device through an interface module;
the operation processing device processes the environmental signal to obtain a first display signal;
the operation processing device transmits the first display signal to the display module through the interface module and displays the first display signal.
In some embodiments, the interface module further includes a signal switching control module, where the signal switching control module includes a DP-MIPI interface element, and the operation processing device transmitting the first display signal to the display module through the interface module for display includes:
the operation processing device transmits the first display signal to the DP-MIPI interface element;
the computing processing module processes the first display signal through the DP-MIPI interface element to obtain a second display signal;
the computing processing module transmits the second display signal to the display module through the DP-MIPI interface element;
the computing processing module processes the second display signal into an image signal through the display module and displays the image signal.
In some embodiments, the display module includes an optical waveguide driver and a display element, and the computing processing module processing the second display signal into an image signal and displaying it through the display module includes: the calculation processing module processes the second display signal into an image signal by controlling the logic operation of the optical waveguide driver, and displays the image signal on the display element.
Compared with the related art, in the split-type AR display device and method provided by the embodiments of the present application, the information acquisition module acquires environment information of the environment where the AR display device is located; the computing processing module processes the environment information into an environment signal and transmits it to the operation processing device through the interface module; the operation processing device processes the environment signal to obtain a first display signal; the DP-MIPI interface element processes the first display signal into a second display signal; and the computing processing module controls the logic operation of the optical waveguide driver to process the second display signal into an image signal and display it on the display element. This solves the problems of the heavy weight of AR glasses and the high cost of the display driving part, and reduces the weight and cost of the AR glasses.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application. In the drawings:
FIG. 1 is a structural block diagram of a split-type AR display device according to an embodiment of the present application;
FIG. 2 is a first specific structural block diagram of a split-type AR display device according to an embodiment of the present application;
FIG. 3 is a second specific structural block diagram of a split-type AR display device according to an embodiment of the present application;
FIG. 4 is a third specific structural block diagram of a split-type AR display device according to an embodiment of the present application;
FIG. 5 is a fourth specific structural block diagram of a split-type AR display device according to an embodiment of the present application;
FIG. 6 is a fifth specific structural block diagram of a split-type AR display device according to an embodiment of the present application;
FIG. 7 is a sixth specific structural block diagram of a split-type AR display device according to an embodiment of the present application;
FIG. 8 is a flowchart of a split-type AR display method according to an embodiment of the present application.
Description of the drawings: 11. a display device; 12. an operation processing device; 13. an information acquisition module; 14. a calculation processing module; 15. a display module; 16. an interface module; 21. an inertial sensing element; 22. an optical sensing element; 23. an audio sensing module; 24. a camera module; 31. a USB hub element; 32. a Type-C interface element; 41. a signal switching control module; 51. an audio acquisition element; 52. an audio playing element; 53. an audio control element; 61. an RGB image pickup element; 62. a first image pickup element; 63. a second image pickup element; 71. an optical waveguide driver; 72. a display element.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application clearer, the present application is described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided herein without creative effort shall fall within the scope of protection of the present application.
It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and a person of ordinary skill in the art may apply the present application to other similar situations according to these drawings without inventive effort. Moreover, it should be appreciated that although such a development effort might be complex and time-consuming, it would nevertheless be a routine undertaking of design, fabrication or manufacture for those of ordinary skill having the benefit of this disclosure.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly and implicitly understood by those of ordinary skill in the art that the embodiments described herein can be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar terms herein do not denote a limitation of quantity, but rather denote the singular or plural. The terms "comprising," "including," "having," and any variations thereof, are intended to cover a non-exclusive inclusion; for example, a process, method, apparatus, article, or apparatus that comprises a list of steps or modules (elements) is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus. The terms "connected," "coupled," and the like in this application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. The term "plurality" as used herein refers to two or more. "and/or" describes an association relationship of an association object, meaning that there may be three relationships, e.g., "a and/or B" may mean: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship. The terms "first," "second," "third," and the like, as used herein, are merely distinguishing between similar objects and not representing a particular ordering of objects.
The embodiments of the present application provide a split-type AR display device. FIG. 1 is a structural block diagram of the split-type AR display device according to an embodiment of the present application. As shown in FIG. 1, the AR display device includes a display device 11 and an operation processing device 12, where the display device 11 includes an information acquisition module 13, a calculation processing module 14, a display module 15 and an interface module 16, and the display device 11 is connected with the operation processing device 12 through the interface module 16. It should be noted that the display device 11 is configured with a built-in power supply, and the operation processing device 12 may be a device with a certain computing capability, such as a smartphone, a tablet computer, a notebook computer or a desktop computer; the operation processing device 12 may be externally powered, and may also serve as an external power supply for the display device 11.
the computing processing module 14 acquires environmental information of the environment where the AR display device is located through the information acquisition module 13;
the computing processing module 14 processes the environmental information to obtain an environmental signal;
the computing processing module 14 transmits the environment signal to the operation processing device 12 through the interface module 16;
the operation processing device 12 processes the environment signal to obtain a first display signal;
the operation processing device 12 transmits the first display signal to the display module 15 through the interface module 16 for display.
In the split-type AR display device provided by the embodiments of the present application, the information acquisition module 13 acquires environment information of the environment where the AR display device is located; the calculation processing module 14 processes the environment information to obtain an environment signal and transmits it to the operation processing device 12 through the interface module 16; the operation processing device 12 processes the environment signal to obtain a first display signal and transmits the first display signal to the display module 15 through the interface module 16 for display. Because the display device 11 and the operation processing device 12 are separated, the operation processing device 12 can be externally powered and can also serve as an external power supply for the display device 11, and a user can independently update or upgrade only the hardware that needs updating without replacing the whole device. This solves the problem of the high upgrade cost of AR glasses and improves the battery life and performance of the AR display device.
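To make the division of labor concrete, the following Python sketch models the glasses-side display device 11 and the external operation processing device 12 as two objects exchanging signals; the class and function names (DisplayDevice, OperationProcessingDevice, run_once) and the placeholder data are hypothetical illustrations, not part of the disclosure.

```python
# Illustrative sketch: the split between the glasses (display device 11)
# and the external host (operation processing device 12).
from dataclasses import dataclass


@dataclass
class EnvironmentInfo:
    """Raw data gathered by the information acquisition module 13."""
    motion: dict          # inertial sensing element 21 (gyro/accelerometer/magnetometer)
    ambient_light: float  # optical sensing element 22
    audio: bytes          # audio sensing module 23
    images: list          # camera module 24 (RGB + binocular fisheye frames)


class DisplayDevice:
    """Glasses side: acquires data, forwards it, and displays returned frames."""

    def acquire_environment_info(self) -> EnvironmentInfo:
        # Placeholder values stand in for real sensor reads.
        return EnvironmentInfo(motion={"gyro": (0, 0, 0)}, ambient_light=120.0,
                               audio=b"", images=[])

    def to_environment_signal(self, info: EnvironmentInfo) -> bytes:
        # Calculation processing module 14: pack sensor data for the interface module 16.
        return repr(info).encode()

    def display(self, image_signal: bytes) -> None:
        print(f"displaying {len(image_signal)} bytes on the optical waveguide display")


class OperationProcessingDevice:
    """External host (phone/PC): turns environment signals into display signals."""

    def process(self, environment_signal: bytes) -> bytes:
        # A real implementation would render AR content; here we just wrap the input.
        return b"FIRST_DISPLAY_SIGNAL:" + environment_signal


def run_once(glasses: DisplayDevice, host: OperationProcessingDevice) -> None:
    info = glasses.acquire_environment_info()
    env_signal = glasses.to_environment_signal(info)   # lightweight work on the glasses
    first_display_signal = host.process(env_signal)    # heavy computation off-device
    glasses.display(first_display_signal)              # result returned over the interface


run_once(DisplayDevice(), OperationProcessingDevice())
```

The split is visible in run_once: only acquisition, packing and display remain on the glasses, while rendering happens on the host, which is what allows the glasses-side hardware to stay light and inexpensive.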
In some embodiments, FIG. 2 is a first specific structural block diagram of a split-type AR display device according to an embodiment of the present application. As shown in FIG. 2, the information acquisition module 13 includes an inertial sensing element 21, an optical sensing element 22, an audio sensing module 23 and a camera module 24; the inertial sensing element 21 specifically includes a gyroscope, an accelerometer, a magnetometer and other related force detection elements. The calculation processing module 14 acquiring environment information of the environment where the AR display device is located through the information acquisition module 13 includes:
the inertial sensing element 21 acquires motion information of the AR display device;
the optical sensing element 22 acquires ambient light information of the environment in which the AR display device is located;
the audio sensing module 23 acquires environmental audio information of the environment where the AR display device is located;
the camera module 24 acquires environmental image information of the environment in which the AR display device is located.
In some embodiments, FIG. 3 is a second specific structural block diagram of a split-type AR display device according to an embodiment of the present application. As shown in FIG. 3, the interface module 16 includes a USB HUB element 31 and a Type-C interface element 32, where the USB HUB element 31 includes a USB HUB chip and the Type-C interface element 32 is a USB3.0 high-speed interface of Type-C. The computing processing module 14 transmitting the environment signal to the operation processing device 12 through the interface module 16 includes: the computing processing module 14 transmits the environment signal to the USB HUB chip, and the USB HUB chip processes the environment signal and transmits it to the operation processing device 12 through the USB3.0 high-speed interface of Type-C.
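As a minimal sketch of how a USB hub can aggregate several sensor streams onto one upstream Type-C USB3.0 link, the fragment below tags each frame with its source so the operation processing device 12 can demultiplex it; the UsbHub class, source IDs and frame header are hypothetical and not specified by the patent.

```python
# Illustrative sketch: a USB HUB chip viewed as a multiplexer that forwards
# tagged frames over one Type-C USB3.0 upstream link to the host.
from collections import deque


class UsbHub:
    def __init__(self):
        self.upstream = deque()  # stands in for the Type-C USB3.0 link

    def forward(self, source_id: int, payload: bytes) -> None:
        # Prefix each frame with its source and length so the host can demultiplex it.
        header = source_id.to_bytes(1, "big") + len(payload).to_bytes(4, "big")
        self.upstream.append(header + payload)


hub = UsbHub()
hub.forward(0x01, b"imu-sample")        # inertial sensing element 21
hub.forward(0x02, b"ambient-light")     # optical sensing element 22
hub.forward(0x03, b"mic-frame")         # audio sensing module 23
hub.forward(0x04, b"camera-usb-frame")  # camera module 24 via USB bridge chip
print(len(hub.upstream), "frames queued for the operation processing device")
```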
In some embodiments, FIG. 4 is a third specific structural block diagram of a split-type AR display device according to an embodiment of the present application. As shown in FIG. 4, the interface module 16 further includes a signal switching control module 41; the signal switching control module 41 specifically includes a DP-MIPI interface element and a DP & USB interface element, where the DP-MIPI interface element is specifically a DP-MIPI interface chip and the DP & USB interface element is specifically a USB3.0 high-speed interface of DP & USB. The operation processing device 12 transmitting the first display signal to the display module 15 through the interface module 16 for display includes:
the operation processing device 12 transmits the first display signal to a USB3.0 high-speed interface of DP & USB;
the computing processing module 14 converts the first display signal into a DP signal by controlling the signal switching logic of the USB3.0 high-speed interface of DP & USB;
the calculation processing module 14 transmits the DP signal to the DP-MIPI interface chip and converts the DP signal into a MIPI signal;
the calculation processing module 14 transmits the MIPI signal to the display module 15;
the calculation processing module 14 processes the MIPI signal into an image signal through the display module 15 and displays the image signal.
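The conversion chain above can be summarized as three stages. The sketch below is a hypothetical model of that chain; the function names and dictionary fields are illustrative, and real DP and MIPI links carry packetized video streams rather than Python dicts.

```python
# Illustrative sketch: the display path of FIG. 4 as three conversion stages.
def to_dp_signal(first_display_signal: dict) -> dict:
    # DP & USB interface: signal switching logic routes the frame onto the DP lanes.
    return {"protocol": "DisplayPort", "frame": first_display_signal["frame"]}


def dp_to_mipi(dp_signal: dict) -> dict:
    # DP-MIPI interface chip: re-packetizes the DP stream for the MIPI side.
    return {"protocol": "MIPI", "frame": dp_signal["frame"]}


def mipi_to_image(mipi_signal: dict) -> bytes:
    # Display module 15: turns the MIPI stream into the image actually shown.
    return bytes(mipi_signal["frame"])


first_display_signal = {"frame": [0, 255, 128]}
image = mipi_to_image(dp_to_mipi(to_dp_signal(first_display_signal)))
print("image bytes:", list(image))
```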
In some embodiments, FIG. 5 is a fourth specific structural block diagram of a split-type AR display device according to an embodiment of the present application. As shown in FIG. 5, the audio sensing module 23 includes an audio acquisition element 51, an audio playing element 52 and an audio control element 53, where the audio control element 53 is specifically a USB Audio Codec chip. The audio sensing module 23 acquiring environmental audio information of the environment where the AR display device is located includes:
the USB Audio Codec chip acquires an environment MIC signal of the environment where the AR display device is located through an Audio acquisition element 51;
the USB Audio Codec chip receives the Audio signal from the interface module 16 and plays the Audio signal through the Audio playing element 52.
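The audio path is thus symmetric: the codec sends the environment MIC signal upstream and plays downstream audio received from the interface module 16. A minimal sketch of that loop, assuming a hypothetical UsbAudioCodec class and placeholder sample buffers:

```python
# Illustrative sketch: the two duties of the audio control element 53 in FIG. 5 -
# uplink of the environment MIC signal and downlink playback of received audio.
class UsbAudioCodec:
    def capture_mic(self) -> bytes:
        # Audio acquisition element 51: a silent 10 ms frame stands in for real samples.
        return bytes(160)

    def play(self, audio_signal: bytes) -> None:
        # Audio playing element 52.
        print(f"playing {len(audio_signal)} bytes of downlink audio")


codec = UsbAudioCodec()
uplink = codec.capture_mic()   # environment MIC signal sent toward the host
codec.play(b"\x00\x01" * 80)   # audio signal received from the interface module
print("uplink frame size:", len(uplink))
```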
In some embodiments, FIG. 6 is a fifth specific structural block diagram of a split-type AR display device according to an embodiment of the present application. As shown in FIG. 6, the camera module 24 includes an RGB image pickup element 61, a first image pickup element 62 and a second image pickup element 63, where the first image pickup element 62 is specifically a first 6DOF camera (also referred to as a fisheye camera) and the second image pickup element 63 is specifically a second 6DOF camera; the camera module 24 further includes a USB bridge chip and a video synthesizer. The camera module 24 acquiring environment image information of the environment where the AR display device is located includes:
the RGB image pickup element 61 acquires video image information of the environment in which the AR display device is located in real time;
the first 6DOF camera and the second 6DOF camera acquire spatial information of the environment where the AR display device is located in real time;
the video image information acquired in real time by the RGB image pickup element 61 and the spatial information of the environment scanned in real time by the first 6DOF camera and the second 6DOF camera (a binocular fisheye camera pair) are converted, as an environment image signal, into a USB signal in the USB bridge chip; the USB signal is sent to the USB HUB chip and then transmitted to the operation processing device 12 for processing, so as to realize real-time millimeter-level positioning and mapping of the real three-dimensional space, a head 6DOF function and a gesture recognition function.
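A minimal sketch of that camera path, assuming a hypothetical usb_bridge function and frame format (the patent does not specify the framing), shows how tagged RGB and binocular fisheye frames could be merged into one USB payload for the USB HUB chip:

```python
# Illustrative sketch: merging the RGB stream and the binocular fisheye (6DOF)
# streams into one USB signal, as the USB bridge chip in FIG. 6 does before
# handing the data to the USB HUB chip and on to the host.
from dataclasses import dataclass
from typing import List


@dataclass
class Frame:
    source: str   # "rgb", "fisheye_left" or "fisheye_right"
    data: bytes


def usb_bridge(frames: List[Frame]) -> bytes:
    # Concatenate tagged frames into a single USB payload; a real bridge would
    # also synchronize timestamps so the host can run SLAM / 6DOF tracking.
    payload = b""
    for f in frames:
        payload += f.source.encode() + b"|" + len(f.data).to_bytes(4, "big") + f.data
    return payload


usb_signal = usb_bridge([
    Frame("rgb", b"\x10" * 64),            # RGB image pickup element 61
    Frame("fisheye_left", b"\x20" * 32),   # first 6DOF camera 62
    Frame("fisheye_right", b"\x30" * 32),  # second 6DOF camera 63
])
print("USB payload size sent to the USB HUB chip:", len(usb_signal))
```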
In some embodiments, FIG. 7 is a sixth specific structural block diagram of a split-type AR display device according to an embodiment of the present application. As shown in FIG. 7, the display module 15 includes an optical waveguide driver 71 and a display element 72; the optical waveguide driver 71 includes two digital micromirror device drivers (DMD Drivers) and a power management integrated circuit (PMIC), and the display element 72 includes two digital micromirror devices (DMDs). The calculation processing module 14 processing the second display signal into an image signal and displaying it through the display module 15 includes: the calculation processing module 14 processes the second display signal into an image signal by controlling the logic operation of the DMD Drivers and the PMIC, and displays the image signal on the DMDs, thereby solving the problem of the high cost of the display driving part of AR glasses and reducing the cost of the AR glasses.
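Since a DMD mirror is binary, grayscale is typically built from time-weighted bit-planes. The sketch below illustrates that idea only; the frame_to_bitplanes and drive_dmds functions are hypothetical, and vendor-specific DMD Driver and PMIC register maps are not modeled.

```python
# Illustrative sketch: turning a MIPI frame into per-eye DMD bit-planes.
def frame_to_bitplanes(frame: list, bits: int = 8) -> list:
    # A DMD shows binary mirror states, so grayscale is built from weighted bit-planes.
    return [[(pixel >> b) & 1 for pixel in frame] for b in range(bits)]


def drive_dmds(mipi_frame: list) -> None:
    left, right = mipi_frame, mipi_frame   # same frame to both DMDs for simplicity
    for eye, frame in (("left DMD", left), ("right DMD", right)):
        planes = frame_to_bitplanes(frame)
        # A real DMD Driver would clock each plane out with PMIC-timed illumination.
        print(f"{eye}: {len(planes)} bit-planes of {len(frame)} mirrors")


drive_dmds([0, 64, 128, 255])
```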
An embodiment of the present application provides a split type AR display method, and fig. 8 is a flowchart of the split type AR display method according to an embodiment of the present application, as shown in fig. 8, the method includes the following steps:
S802, the computing processing module 14 acquires environment information of the environment where the AR display device is located through the information acquisition module 13;
S804, the computing processing module 14 processes the environment information to obtain an environment signal;
S806, the computing processing module 14 transmits the environment signal to the operation processing device 12 through the interface module 16;
S808, the operation processing device 12 processes the environment signal to obtain a first display signal, and transmits the first display signal to the USB3.0 high-speed interface of DP & USB;
S810, the computing processing module 14 converts the first display signal into a DP signal by controlling the signal switching logic of the USB3.0 high-speed interface of DP & USB;
S812, the calculation processing module 14 transmits the DP signal to the DP-MIPI interface chip and converts it into a MIPI signal;
S814, the calculation processing module 14 processes the MIPI signal into an image signal by controlling the logic operation of the digital micromirror device drivers (DMD Drivers) and the PMIC, and displays the image signal on the digital micromirror devices (DMDs).
In this embodiment of the present application, through steps S802 to S814, the information acquisition module 13 acquires environment information of the environment where the AR display device is located; the calculation processing module 14 processes the environment information to obtain an environment signal and transmits it to the operation processing device 12 through the interface module 16; the operation processing device 12 processes the environment signal to obtain a first display signal; the calculation processing module 14 converts the first display signal into a DP signal by controlling the signal switching logic of the USB3.0 high-speed interface of DP & USB, and then converts the DP signal into a MIPI signal through the DP-MIPI interface chip; finally, the calculation processing module 14 processes the MIPI signal into an image signal by controlling the logic operation of the DMD Drivers and the PMIC, and displays the image signal on the DMDs. This solves the problems of the heavy weight of AR glasses and the high cost of the display driving part, and reduces the weight and cost of the AR glasses.
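For reference, steps S802 to S814 can be read as one sequential pipeline. The sketch below strings the steps together in order; the step-level data formats and placeholder values are hypothetical stand-ins for the modules described above.

```python
# Illustrative sketch: steps S802-S814 of FIG. 8 as one function.
def split_ar_display_method() -> None:
    environment_info = {"imu": (0, 0, 0), "frames": []}                  # S802
    environment_signal = str(environment_info).encode()                  # S804
    # S806: the environment signal travels over the interface module to the host.
    first_display_signal = b"render:" + environment_signal               # S808 (host side)
    dp_signal = {"protocol": "DP", "payload": first_display_signal}      # S810
    mipi_signal = {"protocol": "MIPI", "payload": dp_signal["payload"]}  # S812
    # S814: the DMD Drivers and PMIC turn the MIPI signal into the displayed image.
    print("displayed", len(mipi_signal["payload"]), "bytes on the DMDs")


split_ar_display_method()
```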
It should be understood by those skilled in the art that the technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as there is no contradiction between the combinations, they should be considered to be within the scope of this description.
The above embodiments merely represent several implementations of the present application, and although they are described in relative detail, they are not to be construed as limiting the scope of the present application. It should be noted that various modifications and improvements can be made by those skilled in the art without departing from the concept of the present application, and these all fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (4)

1. The split AR display device is characterized by comprising a display device and an operation processing device, wherein the display device comprises an information acquisition module, a calculation processing module, a display module and an interface module, and the display device is connected with the operation processing device through the interface module;
the information acquisition module comprises an audio sensing module, wherein the audio sensing module comprises an audio acquisition element, an audio playing element and an audio control element; the computing processing module obtaining, through the information acquisition module, environmental information of an environment where the AR display device is located includes:
the audio control element acquires environment audio information of the environment where the AR display device is located through the audio acquisition element;
the audio control element receives the audio signal of the interface module and plays the audio signal through the audio playing element;
the computing processing module processes the environmental information to obtain an environmental signal;
the computing processing module transmits the environment signal to the operation processing device through the interface module;
the operation processing device processes the environmental signal to obtain a first display signal, and the first display signal is transmitted to a USB3.0 high-speed interface of DP & USB; the computing processing module converts the first display signal into a DP signal through controlling signal switching logic of a USB3.0 high-speed interface of the DP & USB;
the interface module comprises a signal switching control module, the signal switching control module comprises a DP-MIPI interface element, and the operation processing device transmits the DP signal to the DP-MIPI interface element; the computing processing module converts the DP signal into an MIPI signal through the DP-MIPI interface element; the computing processing module transmits the MIPI signal to the display module through the DP-MIPI interface element; the display module comprises an optical waveguide drive and a display element, wherein the optical waveguide drive comprises two digital micromirror device drive DMD drivers and an integrated power management circuit PMIC, the display element comprises two digital micromirror devices DMD, and the calculation processing module processes MIPI signals into image signals by controlling logic operation of the digital micromirror device drive DMD drivers and the integrated power management circuit PMIC and displays the image signals on the two digital micromirror devices DMD;
the interface module comprises a USB HUB element, wherein the USB HUB element comprises a USB HUB chip;
the information acquisition module further comprises: the camera module is used for acquiring environment image information of the environment where the AR display equipment is located;
the camera module comprises an RGB (red, green and blue) image pickup element, a first image pickup element and a second image pickup element, and the camera module acquires environment image information of the environment where the AR display device is located, and the method comprises the following steps: the RGB camera element acquires video image information of the environment where the AR display device is located in real time; the first image pickup element and the second image pickup element acquire space information of the environment where the AR display device is located in real time;
the first camera element is a first 6DOF camera, the second camera element is a second 6DOF camera, and the camera module further comprises a USB bridge chip and a video synthesizer; and the video image information obtained in real time through the RGB camera element and the spatial information of the real-time scanning environment of the first 6DOF camera and the second 6DOF camera convert the environment image signal into a USB signal in the USB bridge chip, send the USB signal to the USB HUB chip and transmit the USB signal to the operation processing device for processing.
2. The device of claim 1, wherein the information acquisition module includes an inertial sensing element, an optical sensing element, and the computing processing module obtains, via the information acquisition module, environmental information of an environment in which the AR display device is located includes:
the inertial sensing element obtains motion information of the AR display device;
the optical sensing element acquires ambient light information of an environment in which the AR display device is located.
3. The device of claim 1, wherein the interface module comprises a USB hub element and a Type-C interface element, and wherein the computing processing module transmitting the environment signal to the operation processing device through the interface module comprises: the computing processing module transmits the environment signal to the USB hub element, and the USB hub element processes the environment signal and transmits the environment signal to the operation processing device through the Type-C interface element.
4. A split-type AR display method, wherein the AR display method includes:
the information acquisition module comprises an audio sensing module, wherein the audio sensing module comprises an audio acquisition element, an audio playing element and an audio control element; the computing processing module obtaining the environmental information of the environment where the AR display device is located through the information acquisition module comprises the following steps:
the audio control element acquires environment audio information of the environment where the AR display device is located through the audio acquisition element;
the audio control element receives an audio signal of the interface module and plays the audio signal through the audio playing element; the computing processing module processes the environmental information to obtain an environmental signal;
the computing processing module transmits the environment signal to an operation processing device through an interface module;
the operation processing device processes the environmental signal to obtain a first display signal, and the first display signal is transmitted to a USB3.0 high-speed interface of DP & USB; the computing processing module converts the first display signal into a DP signal through controlling signal switching logic of a USB3.0 high-speed interface of the DP & USB;
the interface module comprises a signal switching control module, the signal switching control module comprises a DP-MIPI interface element, and the operation processing device transmits the DP signal to the DP-MIPI interface element; the computing processing module converts the DP signal into an MIPI signal through the DP-MIPI interface element; the computing processing module transmits the MIPI signal to a display module through the DP-MIPI interface element; the display module comprises an optical waveguide drive and a display element, wherein the optical waveguide drive comprises two digital micromirror device drive DMD drivers and an integrated power management circuit PMIC, the display element comprises two digital micromirror devices DMD, and the calculation processing module processes MIPI signals into image signals by controlling logic operation of the digital micromirror device drive DMD drivers and the integrated power management circuit PMIC and displays the image signals on the two digital micromirror devices DMD;
the interface module comprises a USB HUB element, wherein the USB HUB element comprises a USB HUB chip;
the information acquisition module further comprises: the camera module is used for acquiring environment image information of the environment where the AR display equipment is located;
the camera module comprises an RGB (red, green and blue) image pickup element, a first image pickup element and a second image pickup element, and the camera module acquires environment image information of the environment where the AR display device is located, and the method comprises the following steps: the RGB camera element acquires video image information of the environment where the AR display device is located in real time; the first image pickup element and the second image pickup element acquire space information of the environment where the AR display device is located in real time;
the first camera element is a first 6DOF camera, the second camera element is a second 6DOF camera, and the camera module further comprises a USB bridge chip and a video synthesizer; and the video image information obtained in real time through the RGB camera element and the spatial information of the real-time scanning environment of the first 6DOF camera and the second 6DOF camera convert the environment image signal into a USB signal in the USB bridge chip, send the USB signal to the USB HUB chip and transmit the USB signal to the operation processing device for processing.
CN202110163539.9A 2021-02-05 2021-02-05 Split AR display device and display method Active CN113031265B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110163539.9A CN113031265B (en) 2021-02-05 2021-02-05 Split AR display device and display method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110163539.9A CN113031265B (en) 2021-02-05 2021-02-05 Split AR display device and display method

Publications (2)

Publication Number Publication Date
CN113031265A (en) 2021-06-25
CN113031265B (en) 2023-06-30

Family

ID=76460128

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110163539.9A Active CN113031265B (en) 2021-02-05 2021-02-05 Split AR display device and display method

Country Status (1)

Country Link
CN (1) CN113031265B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103345064A (en) * 2013-07-16 2013-10-09 卫荣杰 Cap integrated with 3D identifying and 3D identifying method of cap
CN103345065A (en) * 2013-07-16 2013-10-09 江苏慧光电子科技有限公司 Wearable head up optical system
CN110892408A (en) * 2017-02-07 2020-03-17 迈恩德玛泽控股股份有限公司 Systems, methods, and apparatus for stereo vision and tracking

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8752963B2 (en) * 2011-11-04 2014-06-17 Microsoft Corporation See-through display brightness control
KR20170081351A (en) * 2016-01-04 2017-07-12 한국전자통신연구원 Providing apparatus for augmented reality service, display apparatus and providing system for augmented reality service comprising thereof
CN106019592A (en) * 2016-07-15 2016-10-12 中国人民解放军63908部队 Augmented reality optical transmission-type helmet mounted display pre-circuit and control method thereof
CN107240156B (en) * 2017-06-07 2019-07-23 武汉大学 A kind of high-precision outdoor augmented reality spatial information display system and method
CN107678351A (en) * 2017-10-24 2018-02-09 西安闻泰电子科技有限公司 Separate type VR equipment and MIPI Display Realization methods
CN109246423B (en) * 2018-08-16 2021-03-09 歌尔光学科技有限公司 VR equipment and VR equipment detection method
CN109243442A (en) * 2018-09-28 2019-01-18 歌尔科技有限公司 Sound monitoring method, device and wear display equipment
CN208766396U (en) * 2018-09-29 2019-04-19 北京悉见科技有限公司 A kind of wearable AR equipment and AR display system
CN109947683A (en) * 2019-04-26 2019-06-28 歌尔科技有限公司 a VR device
CN110623820A (en) * 2019-07-15 2019-12-31 电子科技大学 A wearable intelligent blind guide device
CN110717994A (en) * 2019-10-21 2020-01-21 联想(北京)有限公司 Method for realizing remote video interaction and related equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103345064A (en) * 2013-07-16 2013-10-09 卫荣杰 Cap integrated with 3D identifying and 3D identifying method of cap
CN103345065A (en) * 2013-07-16 2013-10-09 江苏慧光电子科技有限公司 Wearable head up optical system
CN110892408A (en) * 2017-02-07 2020-03-17 迈恩德玛泽控股股份有限公司 Systems, methods, and apparatus for stereo vision and tracking

Also Published As

Publication number Publication date
CN113031265A (en) 2021-06-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 310000 room 208, building 1, 1818-1, Wenyi West Road, Yuhang street, Yuhang District, Hangzhou City, Zhejiang Province

Patentee after: Hangzhou Pimax Intelligent Technology Co.,Ltd.

Address before: 310000 room 208, building 1, 1818-1, Wenyi West Road, Yuhang street, Yuhang District, Hangzhou City, Zhejiang Province

Patentee before: Hangzhou Xiaopai Intelligent Technology Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20231020

Address after: 324404 1-4 on the first floor, 1-1, 1-2, 1-4 on the second floor, Block 1, Robot Industrial Park, Guangji Road, Longyou Economic Development Zone, Mohuan Township, Longyou County, Quzhou City, Zhejiang Province

Patentee after: Xiaopai Technology (Quzhou) Co.,Ltd.

Address before: 310000 room 208, building 1, 1818-1, Wenyi West Road, Yuhang street, Yuhang District, Hangzhou City, Zhejiang Province

Patentee before: Hangzhou Pimax Intelligent Technology Co.,Ltd.