CN118131904A - Equipment quality inspection method, device and system based on AR (augmented reality) glasses - Google Patents
- Publication number
- CN118131904A (application CN202410141224.8A)
- Authority
- CN
- China
- Prior art keywords
- quality inspection
- glasses
- inspected
- real
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/903—Querying
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0639—Performance analysis of employees; Performance analysis of enterprise or organisation operations
- G06Q10/06395—Quality analysis or management
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/20—Administration of product repair or maintenance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The application discloses an equipment quality inspection method, device and system based on AR (augmented reality) glasses, which are used for improving equipment quality inspection efficiency. The scheme provided by the embodiments of the application comprises the following steps: capturing an appearance image of the equipment to be inspected through a camera of the AR glasses; acquiring quality inspection information matching the appearance image from a network-side server over a pre-established communication link, wherein the network-side server stores quality inspection information for at least one device, and the quality inspection information comprises a virtual model of the equipment to be inspected and quality inspection flow information corresponding to at least one region to be inspected in the virtual model; comparing the real-time picture captured by the camera with the virtual model to determine the target region to be inspected contained in the real-time picture; and displaying the quality inspection flow information corresponding to the target region to be inspected in a display area of the AR glasses.
Description
Technical Field
The application relates to the technical field of augmented reality, in particular to an equipment quality inspection method, device and system based on AR (augmented reality) glasses.
Background
In the field of production and manufacturing, the running state of equipment can be checked through quality inspection, so that equipment faults are found and repaired in time and the equipment is kept in a normal running state.
Quality inspection procedures often differ from one piece of equipment to another. When inspection is carried out manually, its effectiveness depends on the experience of the quality inspector, and the results are therefore subjective. When many types of equipment must be inspected, or new equipment appears, inspection efficiency tends to be low because it hinges on the inspector's familiarity with quality inspection regulations, mastery of operating standards, and experience with the equipment.
How to improve equipment quality inspection efficiency is the technical problem addressed by this application.
Disclosure of Invention
The embodiment of the application aims to provide an equipment quality inspection method, device and system based on AR (augmented reality) glasses, which are used for improving the equipment quality inspection efficiency.
In a first aspect, an equipment quality inspection method based on AR glasses is provided, applied to augmented reality (AR) glasses, and comprising:
capturing an appearance image of the equipment to be inspected through a camera of the AR glasses;
Acquiring quality inspection information matched with the appearance image from a network side server through a pre-constructed communication link, wherein the network side server stores quality inspection information of at least one device, and the quality inspection information comprises a virtual model of the device to be inspected and quality inspection flow information corresponding to at least one region to be inspected in the virtual model;
Comparing the real-time picture acquired by the camera with the virtual model to determine a target to-be-detected area contained in the real-time picture;
And displaying quality inspection flow information corresponding to the target to-be-inspected area in a display area of the AR glasses.
In a second aspect, an apparatus quality inspection device based on AR glasses is provided, including:
a collection module, which captures an appearance image of the equipment to be inspected through a camera of the AR glasses;
an acquisition module, which acquires quality inspection information matching the appearance image from the network side over a pre-established communication link, wherein the quality inspection information comprises a virtual model of the equipment to be inspected and quality inspection flow information corresponding to at least one region to be inspected in the virtual model;
The determining module is used for comparing the real-time picture acquired by the camera with the virtual model to determine a target to-be-detected area contained in the real-time picture;
And the display module displays the quality inspection flow information corresponding to the target to-be-inspected area in the display area of the AR glasses.
In a third aspect, there is provided an electronic device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program implementing the steps of the method as in the first aspect when executed by the processor.
In a fourth aspect, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method as in the first aspect.
In the embodiment of the application, an appearance image of the equipment to be inspected is first captured through a camera of the AR glasses. Quality inspection information matching the appearance image is then acquired from a network-side server over a pre-established communication link; the network side stores quality inspection information for at least one device, comprising a virtual model of the equipment to be inspected and quality inspection flow information corresponding to at least one region to be inspected in the virtual model. The real-time picture captured by the camera is then compared with the virtual model to determine the target region to be inspected contained in the picture. Finally, the quality inspection flow information corresponding to the target region is displayed in the display area of the AR glasses. Because the flow information is shown in the display area, a quality inspector wearing the AR glasses can directly see the inspection steps for the region in view. During inspection, the virtual model of the equipment and the flow information for each region can be obtained efficiently through the AR glasses, and the virtual model allows the target region to be located accurately, improving the match between the displayed flow information and the region within the inspector's line of sight and thus the accuracy of the inspection process. Displaying the flow information through the AR glasses also helps inspectors carry out standardized inspection operations and improves inspection efficiency.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is one of the flow diagrams of an AR glasses-based device quality inspection method according to an embodiment of the present application;
FIG. 2a is a second flow chart of an AR glasses-based device quality inspection method according to an embodiment of the present application;
FIG. 2b is a schematic illustration of an application scenario of a computer vision based tracking registration technique according to an embodiment of the present application;
FIG. 3 is a third flow chart of an AR glasses-based device quality inspection method according to an embodiment of the present application;
FIG. 4 is a flow chart of a method for quality inspection of equipment based on AR glasses according to an embodiment of the present application;
FIG. 5a is a flowchart of a method for quality inspection of AR glasses-based device according to an embodiment of the present application;
FIG. 5b is a schematic diagram of a gesture recognition scenario based on TOF technology according to one embodiment of the present application;
FIG. 5c is a flow diagram of recognizing speech instructions based on a pre-trained semantic recognition model according to one embodiment of the present application;
FIG. 5d is a schematic diagram of an interaction flow based on human-machine interaction instructions according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an AR-based device quality inspection apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an AR glasses-based device quality inspection system according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application is made clearly and completely with reference to the accompanying drawings; evidently, the described embodiments are only some, not all, of the embodiments of the application. All other embodiments obtained by those skilled in the art from these embodiments without inventive effort fall within the scope of the application. Reference numerals in the application are only used to distinguish the steps of the scheme and do not limit their execution order; the specific execution order is subject to the description in the specification.
The augmented reality (Augmented Reality, AR) technology is a technology for fusing virtual information with the real world for display, and can be applied to various fields such as multimedia, three-dimensional modeling, real-time tracking and registration, intelligent interaction, sensing and the like. The AR technology can apply virtual information such as characters, images, three-dimensional models, music, video and the like generated by a computer to the real world after simulation, so that the virtual information and the real world are mutually complemented, and the real world is enhanced by the virtual information.
In order to solve the problems in the prior art, an embodiment of the present application provides an equipment quality inspection method based on AR glasses, which is applied to augmented reality AR glasses, as shown in fig. 1, and the scheme provided by the embodiment of the present application includes the following steps:
S11: and the appearance image of the equipment to be inspected is acquired through the cameras of the AR glasses.
In this step, the AR glasses may be a wearable device with augmented reality technology. In practical application, after wearing AR glasses, quality inspectors can control the AR glasses to perform appearance image acquisition through various instructions such as physical or virtual keys, voice instructions, gesture instructions and the like. The AR glasses can shoot through the camera, the direction of the camera can be consistent with the direction of the visual field of the quality inspector, and therefore the collected appearance image is an image in the visual field range of the quality inspector.
In practical applications, this step may be actively triggered by a quality inspector, for example by controlling the AR glasses to capture an image through an instruction. Alternatively, the step may be performed automatically based on image recognition: as the inspector's line of sight moves, the AR glasses analyze the objects within it in real time, and if a device to be inspected may be present, the glasses capture an image through the camera. Further, the captured image can be displayed and the inspector queried to confirm whether it is an appearance image of the device to be inspected.
S12: and acquiring quality inspection information matched with the appearance image from a network side server through a pre-constructed communication link, wherein the network side server stores quality inspection information of at least one device, and the quality inspection information comprises a virtual model of the device to be inspected and quality inspection flow information corresponding to at least one region to be inspected in the virtual model.
The above communication link is used by the AR glasses to connect to a network-side server, which may be a cloud server. In practical applications, the AR glasses may establish the link with the server through a variety of connection modes, such as BeiDou, Bluetooth, Wi-Fi, 4G (fourth-generation mobile communication technology) or 5G (fifth-generation mobile communication technology).
In this step, the AR glasses acquire the quality inspection information of the equipment to be inspected from the network-side server. The appearance image captured by the AR glasses can be associated with the quality inspection information in various ways. For example, if the surface of the device carries an identifier in the form of a two-dimensional code, bar code or serial number, the AR glasses can photograph an image containing the identifier and use it to query the network-side server for the quality inspection flow information of the device.
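The identifier-based lookup described above can be sketched as follows. This is an illustrative mock only: the server-side store is simulated with an in-memory dictionary, and all names (`QUALITY_INSPECTION_DB`, `fetch_quality_inspection_info`, the device ID and region names) are hypothetical, not taken from the patent.

```python
# Hypothetical sketch: resolve a device identifier (e.g. decoded from a QR code
# on the device housing) to its stored quality inspection information.
# A real system would query the network-side server; here the store is a dict.

QUALITY_INSPECTION_DB = {
    "PUMP-0042": {
        "virtual_model": "models/pump_0042.glb",
        "regions": [
            {"id": "bearing_housing",
             "steps": ["Check oil level", "Listen for abnormal noise"]},
            {"id": "coupling_guard",
             "steps": ["Verify guard is secured"]},
        ],
    }
}

def fetch_quality_inspection_info(device_id):
    """Look up quality inspection info for a device; None if unknown."""
    return QUALITY_INSPECTION_DB.get(device_id)

info = fetch_quality_inspection_info("PUMP-0042")
assert info is not None
print(info["regions"][0]["id"])  # bearing_housing
```

An unknown identifier returns `None`, which the glasses could surface as a prompt to re-scan or register the device.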
Optionally, a scene image is captured through the camera of the AR glasses, and point cloud information of the scene is extracted from it by a point cloud recognition algorithm; this point cloud information characterizes the scene and can be used to identify it. Recognizing the scene based on the point cloud information can improve the accuracy of the acquired quality inspection information.
In this example, the device to be detected may be uniquely identified, so as to improve the matching degree between the determined quality inspection flow information and the actual device to be detected. Specifically, besides the coded identification such as the two-dimensional code, the equipment to be detected can be identified through various identification modes such as other two-dimensional image identification, point cloud identification, optical character identification and the like, so that the accuracy of identifying the equipment to be detected is improved, and the accuracy of the determined quality inspection flow information is improved.
Optionally, the AR glasses may analyze and optimize the captured appearance image. Specifically, the feature processing may be performed on the appearance image, which includes, for example, weakening an environmental background area, improving the sharpness of an area where the device to be inspected is located, and the like. Then, the AR glasses can query the matched quality inspection information in the network side server based on the optimized appearance image.
The network-side server may prestore quality inspection information for at least one device, where the quality inspection information comprises a virtual model of the corresponding device and quality inspection flow information for at least one region to be inspected in the virtual model. The virtual model may be a three-dimensional model of the device, or a set of two-dimensional models of the device from multiple angles. The virtual model contains at least one region to be inspected, and each such region corresponds to a group of quality inspection flow information. Through this correspondence, the flow information expresses the quality inspection operation flow, quality inspection standards and related information for that region.
Optionally, the quality inspection flow information may further include a quality inspection operation safety prompt for improving quality inspection safety of the device.
S13: and comparing the real-time picture acquired by the camera with the virtual model to determine a target to-be-detected area contained in the real-time picture.
In practice, the inspector's line of sight changes frequently. In this step, the real-time picture captured by the camera is compared with the virtual model to identify whether a target region requiring quality inspection lies within the inspector's line of sight.
The comparison of the captured picture with the virtual model can be realized through image processing techniques. For example, image comparison may be performed by a pre-trained image processing model whose output represents the similarity between at least part of the real-time picture captured by the camera and the virtual model. If the model output indicates that the real-time picture matches the virtual model, the region of the picture that matches the virtual model is determined to be the target region to be inspected.
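The similarity comparison above is described only abstractly in the patent (a pre-trained model). As a minimal stand-in, the sketch below compares two grayscale images by the cosine similarity of their intensity histograms; the function names, bin count and threshold are illustrative assumptions, and a production system would use learned features or 3D registration instead.

```python
import math

def grayscale_histogram(pixels, bins=16):
    """Normalized intensity histogram over 0-255 grayscale pixel values."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels) or 1
    return [h / total for h in hist]

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def matches_virtual_model(frame_pixels, model_pixels, threshold=0.9):
    """Toy matcher: frame matches the model view when similarity >= threshold."""
    sim = cosine_similarity(grayscale_histogram(frame_pixels),
                            grayscale_histogram(model_pixels))
    return sim >= threshold, sim

frame = [10, 10, 200, 200, 120, 90]
ok, sim = matches_virtual_model(frame, frame)
assert ok and abs(sim - 1.0) < 1e-9  # identical views match perfectly
```

Histograms ignore spatial layout, so this only illustrates the "compare and threshold" structure of the step, not a viable matcher.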
Optionally, the AR glasses may also create a two-dimensional or three-dimensional real-time virtual image based on the acquired real-time frame, and further compare the created real-time virtual image with a virtual model of the network side server, so as to determine a target to-be-inspected area included in the real-time frame.
S14: and displaying quality inspection flow information corresponding to the target to-be-inspected area in a display area of the AR glasses.
In the step, real-time images acquired by a camera are fused with virtual information of quality inspection flow information based on an AR technology, and the quality inspection flow information corresponding to a target to-be-inspected area is displayed on the basis of a real scene image through a display area of AR glasses. Based on the AR technology, the existing image in the quality inspection scene can be identified through image feature point matching in the step, and then the spatial position of the image is calculated according to the point information of the identified image features, and the spatial position can be used for realizing fusion between the virtual image and the real image, so that in the sight of quality inspection personnel, the virtual image of the quality inspection flow information and the real-world equipment to be inspected are relatively static, and the fusion of the virtual information and the real world is realized visually.
In practical application, the target region to be inspected can be highlighted in the real-time picture through various visual means, such as region highlighting, arrow pointers or edge enhancement. Based on the position of the target region, the corresponding quality inspection flow information can then be displayed in forms such as text, static or dynamic images and video, for example in a prompt box.
Optionally, a virtual model to which the target to-be-detected area belongs can be displayed in the display area of the AR glasses, and the position of the target to-be-detected area is highlighted on the virtual model, so that quality inspectors can intuitively see the position of the target to-be-detected area relative to the to-be-detected equipment. The virtual model can be rendered and enhanced according to actual requirements so as to optimize the display effect of the virtual model.
In addition, if the quality inspection flow steps of the equipment to be inspected are more, the quality inspection flow and the corresponding areas to be inspected can be displayed in sequence based on the sequence of the quality inspection steps, so that quality inspection personnel can conveniently and sequentially detect the areas to be inspected of the equipment to be inspected according to the quality inspection steps, and the normalization of the quality inspection flow is improved.
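The sequential display described above amounts to flattening the per-region step lists into one ordered plan. A minimal sketch, assuming a hypothetical region/steps structure (the field names are illustrative, not from the patent):

```python
# Illustrative sketch: walk the regions in their defined quality inspection
# order and number the steps, so the glasses can show them one at a time.

def ordered_inspection_plan(regions):
    """Yield (step_number, region_id, instruction) in inspection order."""
    n = 1
    for region in regions:
        for instruction in region["steps"]:
            yield n, region["id"], instruction
            n += 1

regions = [
    {"id": "bearing_housing", "steps": ["Check oil level", "Listen for noise"]},
    {"id": "coupling_guard", "steps": ["Verify guard is secured"]},
]
plan = list(ordered_inspection_plan(regions))
assert plan[0] == (1, "bearing_housing", "Check oil level")
assert plan[-1] == (3, "coupling_guard", "Verify guard is secured")
```

The glasses could advance through `plan` on a gesture or voice instruction, highlighting each region as its step comes due.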
In the scheme provided by the embodiment of the application, an appearance image of the equipment to be inspected is first captured through the camera of the AR glasses. Quality inspection information matching the appearance image is then acquired from a network-side server over a pre-established communication link; the network side stores quality inspection information for at least one device, comprising a virtual model of the equipment to be inspected and quality inspection flow information corresponding to at least one region to be inspected in the virtual model. The real-time picture captured by the camera is then compared with the virtual model to determine the target region to be inspected contained in the picture. Finally, the quality inspection flow information corresponding to the target region is displayed in the display area of the AR glasses, so that a quality inspector wearing the glasses can directly see the inspection steps for the region in view. During inspection, the virtual model and the flow information for each region can be obtained efficiently through the AR glasses, and the virtual model allows the target region to be located accurately, improving the match between the displayed flow information and the region within the inspector's line of sight and thus the accuracy of the inspection process. Displaying the flow information through the AR glasses also helps inspectors carry out standardized inspection operations and improves inspection efficiency.
Based on the solution provided in the foregoing embodiment, optionally, as shown in fig. 2a, before step S14, that is, before the displaying area of the AR glasses displays the quality inspection flow information corresponding to the target to-be-inspected area, the method further includes the following steps:
S21: and generating real scene enhancement information based on the quality inspection flow information corresponding to the target to-be-inspected area, wherein the real scene enhancement information comprises an area virtual image of the target to-be-inspected area and a flow virtual image of a corresponding quality inspection flow.
In this step, live-action enhancement information, i.e. flow information presented in the form of a virtual image, can be generated by determining the display content and the display position. The quality inspection flow information can be displayed as text, and the text content can be browsed flexibly through page scrolling or page turning.
The live-action enhancement information comprises an area virtual image and a flow virtual image, and the two virtual images can be generated by determining the display content of the virtual image and determining the display position of the virtual image.
The region virtual image can be generated based on an image of the target region to be detected and is used for highlighting the position of the target region to be detected. Specifically, the region virtual image can be identical to the image shape of the target to-be-detected region and overlap with the position of the target to-be-detected region, and is displayed in a mode of changing color and brightness, so that the position and the appearance shape of the target to-be-detected region are highlighted.
The flow virtual image is used for displaying quality inspection flow information corresponding to the target to-be-inspected area, and can be particularly displayed in various forms such as static or dynamic images, videos and characters. The display position of the flow virtual image can be close to the region virtual image, so that the association relationship between the target region to be detected and the flow virtual image is embodied. The region virtual image and the flow virtual image can be connected and displayed in an indication line mode, so that quality inspection personnel can check the region virtual image and the flow virtual image conveniently.
Under the condition that the number of the areas to be detected is large, the areas to be detected can be highlighted and displayed through the area virtual images, and the quality inspection flow information of the areas to be detected is intuitively displayed through the flow virtual images, so that the normalization of quality inspection execution by quality inspection staff is facilitated, and the quality inspection efficiency is improved.
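The two virtual images described above, a region highlight and an adjacent flow panel, can be modeled as a small data structure. This is a hypothetical sketch: the class and field names, coordinate convention and the "panel to the right of the region" placement rule are all assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class RegionOverlay:
    """Virtual image that highlights the target region (mirrors its shape/position)."""
    region_id: str
    bbox: tuple              # (x, y, w, h) in display coordinates
    highlight_color: str = "#FFD400"

@dataclass
class FlowOverlay:
    """Virtual image carrying the flow text, anchored near its region overlay."""
    region_id: str
    text_lines: list
    anchor: tuple            # linked to the region overlay by an indicator line

def build_live_action_enhancement(region_id, bbox, steps, margin=12):
    """Compose both overlays; the flow panel sits to the right of the region box."""
    x, y, w, h = bbox
    region = RegionOverlay(region_id, bbox)
    flow = FlowOverlay(region_id, list(steps), (x + w + margin, y))
    return region, flow

region, flow = build_live_action_enhancement(
    "bearing_housing", (100, 80, 60, 40), ["Check oil level"])
assert flow.anchor == (172, 80)  # 100 + 60 + 12: panel just right of the box
```

Keeping the flow panel's anchor derived from the region box preserves the visual association between the two images as the region moves in the frame.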
In the step S14, displaying, in a display area of the AR glasses, quality inspection flow information corresponding to the target to-be-inspected area, including:
S22: and fusing the real-time picture acquired by the camera with the real-time enhancement information, and displaying the fused real-time picture through the display area of the AR glasses.
This step may be implemented based on a tracking registration technique, here Simultaneous Localization and Mapping (SLAM), i.e., synchronous positioning and three-dimensional space mapping. SLAM gives the AR glasses the ability to perceive complex environments and adapt to dynamic scenes, that is, to understand their surroundings. SLAM matches the positions of real objects and virtual objects in the real environment so that, as seen by the quality inspector, the spatial relationship between the virtual object and the real environment does not change as the user moves.
The virtual object can be fused with the real environment through the tracking registration technology: even if the quality inspector's observation angle or position changes greatly, the relative position of the virtual object in the real environment remains stable, so the inspector perceives the virtual object as existing in the real environment, visually indistinguishable from a real object. On this basis, the quality inspection flow information in the fused, live-action enhanced real-time picture can clearly show the quality inspection flow, and quality inspection personnel can intuitively see, against the real-world scene, which region of the equipment to be inspected requires which quality inspection operation.
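As a minimal illustration of why a world-anchored overlay stays put, the world-to-camera rigid transform can be sketched in pure Python. The single-axis (yaw) rotation and all coordinate values are illustrative assumptions; a real SLAM pipeline estimates a full six-degree-of-freedom pose.

```python
import math

def world_to_camera(p_world, cam_pos, yaw):
    """Transform a world-frame point into the camera frame for a camera
    at cam_pos rotated by `yaw` radians about the vertical (y) axis:
    p_cam = R^T (p_world - cam_pos)."""
    dx = p_world[0] - cam_pos[0]
    dy = p_world[1] - cam_pos[1]
    dz = p_world[2] - cam_pos[2]
    c, s = math.cos(yaw), math.sin(yaw)
    # Inverse (transpose) of a y-axis rotation applied to the offset vector.
    return (c * dx - s * dz, dy, s * dx + c * dz)

# The virtual label is anchored once at a fixed world coordinate...
anchor = (2.0, 1.0, 5.0)
frame1 = world_to_camera(anchor, cam_pos=(0.0, 0.0, 0.0), yaw=0.0)
frame2 = world_to_camera(anchor, cam_pos=(1.0, 0.0, 1.0), yaw=math.radians(15))
# ...so its camera-frame position is recomputed every frame as the pose
# updates, while the world-frame anchor itself never moves.
```

Because only the camera-frame coordinates are recomputed per frame, the overlay appears fixed to the real object regardless of how the inspector moves.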
In this step, virtual-real visual fusion is realized based on the real-time pictures acquired by the camera, for which a computer-vision tracking registration technique may be employed, implemented with machine vision and computer imaging algorithms. Its working principle is as follows: the surrounding scene is photographed in real time through the cameras of the AR glasses; image feature points are identified and matched with various visual or deep-learning algorithms; and the coordinate relationships between objects are then established, ensuring the positional stability of virtual objects in the real environment and thus achieving tracking registration. This approach performs well and is widely applicable: it can be realized with image acquisition devices such as cameras, which helps keep the AR glasses portable.
Fig. 2b shows a schematic view of an application scenario of a tracking registration technique based on computer vision. The dashed box on the left of Fig. 2b shows the positional relationship between the camera and the object in the O_w three-dimensional coordinate system (with axes X_w, Y_w, Z_w). The camera is located at O_c, and a feature point of the object P located at (x_w, y_w, z_w) is projected onto the uv image plane, so the position of a feature point, or the relative positions of feature points, can be represented by vectors. Three-dimensional registration of the object P can be achieved by acquiring feature points from multiple positions: for example, after feature points are acquired from the pose in the left dashed box of Fig. 2b, the camera is moved to the position shown in the right dashed box and feature points are acquired in the O_v three-dimensional coordinate system (with axes X_v, Y_v, Z_v). Three-dimensional registration of the object P is then realized from the feature-point acquisition results of multiple different camera poses.
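The projection of the feature point P onto the uv plane described above follows the pinhole camera model. A short sketch, with illustrative intrinsic parameters (real values come from camera calibration):

```python
def project_point(p_cam, fx=800.0, fy=800.0, cx=320.0, cy=240.0):
    """Project a camera-frame 3D point onto the uv image plane with a
    pinhole model. fx, fy are focal lengths in pixels and (cx, cy) the
    principal point; the defaults are illustrative only."""
    x, y, z = p_cam
    if z <= 0:
        return None  # point is behind the camera, not visible
    return (fx * x / z + cx, fy * y / z + cy)

# A feature point P at (0.5, 0.25, 2.0) in the camera frame lands at:
uv = project_point((0.5, 0.25, 2.0))
```

Matching such projections of the same feature points across two camera poses (the left and right dashed boxes of Fig. 2b) is what allows the relative pose, and hence the registration of P, to be recovered.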
Furthermore, the computer vision tracking registration technology can be further divided according to different feature types, and specifically can be divided into a tracking registration mode based on artificial markers and a tracking registration mode based on natural feature points.
The tracking registration mode based on artificial markers is realized with machine learning and is an application of image recognition in augmented reality. Specifically, a classifier is configured according to actual demand; a machine learning algorithm is trained on view-angle templates of the target image at each angle; the classifier then classifies new inputs, and the template that best matches is taken as the recognition result. This method requires training on the various appearances of the target object in advance and storing them before recognition and classification can be performed. The artificial-marker mode therefore needs to sample each view angle of the target object beforehand: the more sampling points are set, the wider the effective view angle, the more pose angles can be matched, the higher the recognition success rate, and the better the tracking registration precision. Optionally, to reduce the storage and computation load on the device, a balance between accuracy and speed can be configured for the device actually in use, so that tracking registration is realized within the device's performance limits.
The tracking registration mode based on natural feature points performs tracking registration using the large number of natural feature points in the surrounding environment. It registers using natural features of real objects, such as points, lines, planes and textures, without a dedicated hardware tracker or manually preset markers. This mode is therefore simpler and applicable to a wider range of scenes. Its algorithmic core is to extract natural feature points in the real scene that are invariant to changes in view angle, scale and distance, and then to track and register according to the matching of these feature points.
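As an illustrative sketch of matching natural feature points between frames, binary descriptors can be compared by Hamming distance with a ratio test. The descriptors below are toy integers; production systems typically use ORB/BRIEF-style descriptors produced by a vision library, which the patent does not specify.

```python
def hamming(a, b):
    """Hamming distance between two binary descriptors stored as ints."""
    return bin(a ^ b).count("1")

def match_features(desc_a, desc_b, ratio=0.8):
    """Brute-force match binary feature descriptors between two frames,
    keeping a match only when the best candidate is clearly better than
    the second best (a Lowe-style ratio test to reject ambiguous points)."""
    matches = []
    for i, da in enumerate(desc_a):
        dists = sorted((hamming(da, db), j) for j, db in enumerate(desc_b))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((i, dists[0][1]))
    return matches
```

The surviving matches are the stable correspondences from which the coordinate relationships between camera and scene are established.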
Based on the solution provided in the foregoing embodiment, optionally, as shown in fig. 3, in step S22, the real-time picture collected by the camera is fused with the real-time enhancement information, and the fused real-time enhancement picture is displayed through the display area of the AR glasses, which includes the following steps:
S31: and determining a display area for displaying the live-action enhancement information in a real-time picture acquired by the camera, wherein the display area comprises a first display area for displaying the area virtual image and a second display area for displaying the flow virtual image.
In this step, a display area for the live-action enhancement information is determined, in particular its location and shape. The first display area may be located at the position of the target to-be-detected area in the real-time picture; its shape may be the same as that of the target to-be-detected area, or it may be a closed shape, such as a rectangle or a circle, positioned so that it encloses the target to-be-detected area.
The position of the second display area can be adjacent to the first display area, reflecting the association between the area virtual image and the flow virtual image, and thus between the target to-be-detected area and the quality inspection flow information. Optionally, the first and second display areas may be connected by an indication line to make the association more intuitive. The second display area can be a regular shape such as a rectangle or a circle; in practical application its actual shape and dimensions can be determined according to the specific content and display form of the quality inspection flow information, so that the information is displayed clearly.
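A hedged sketch of how the second display area could be placed beside the first, with a fallback when the panel would leave the screen. All names, sizes and the layout policy are illustrative assumptions, not the patent's specification:

```python
def place_flow_panel(region_box, panel_w, panel_h, screen_w, screen_h, gap=10):
    """Place the flow-information panel (second display area) adjacent to
    the highlighted region (first display area). region_box = (x, y, w, h)
    in screen pixels. Prefer the right side of the region; fall back to
    the left when the panel would run off-screen."""
    x, y, w, h = region_box
    px = x + w + gap
    if px + panel_w > screen_w:          # no room on the right
        px = max(0, x - gap - panel_w)   # fall back to the left side
    py = min(max(0, y), screen_h - panel_h)  # vertically aligned, clamped
    return (px, py)
```

An indication line would then simply connect `region_box` to the returned panel origin.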
S32: and displaying the live-action enhanced real-time picture in a display area of the AR glasses, wherein the live-action enhanced real-time picture comprises the area virtual image positioned in the first display area and/or the flow virtual image positioned in the second display area.
In this step, the first display area and/or the second display area are displayed in the display area of the AR glasses, so that the quality inspector can see the virtual information in the real-world scene through the display area.
In practical application, quality inspection personnel can switch the content of showing through instruction control AR glasses's display area. For example, the first display area, the second display area, or both the first display area and the second display area are displayed in the control display area.
According to the scheme provided by the embodiment of the application, the quality inspection flow information can be reasonably displayed through the display area, the rationality of the information display position can be improved, the quality inspection personnel can intuitively and clearly see the area needing quality inspection and the flow information for executing the quality inspection on the area, and the rationality of the quality inspection flow and the quality inspection efficiency can be improved.
Based on the solution provided in the foregoing embodiment, optionally, as shown in fig. 4, in step S12, obtaining, through a pre-constructed communication link, quality inspection information matched with the appearance image from a network side server includes:
S41: and sending the appearance image to a network side server through a pre-constructed communication link, wherein the network side server is used for comparing the appearance image with a pre-stored virtual model of at least one device to be inspected and sending target quality inspection information of the device to be inspected, which is matched with the appearance image, to the AR glasses.
S42: and receiving the target quality inspection information fed back by the network side server.
In this example, the AR glasses send the appearance image captured by the camera to the network side server; the server performs the image comparison and feeds the result back to the AR glasses. In practice, AR glasses are portable devices with limited computing resources and relatively low image-processing efficiency. Offloading the comparison to the network side server effectively improves its speed and accuracy, so the AR glasses can efficiently acquire the target quality inspection information corresponding to the appearance image, improving the efficiency of quality inspection personnel.
Based on the solution provided in the foregoing embodiment, optionally, the target quality inspection information includes real-time device parameters collected in real time from the device to be inspected by the network side server, among which are the parameters to be inspected specified by the quality inspection flow information of the target quality inspection information.
In this example, the device to be detected is in communication connection with the network side server, and the device to be detected can send its own device parameters to the network side server in real time by means of active pushing or passive acquisition. The network side server can correspondingly store the equipment to be detected and the corresponding real-time equipment parameters thereof, and the stored parameters can be used for auxiliary detection of the equipment to be detected and can also be used for fault diagnosis and maintenance.
In an application scene where quality inspection personnel inspect equipment through AR glasses, if the network side server receives real-time equipment parameters of the equipment to be inspected, it determines the corresponding quality inspection flow information for that equipment, screens the real-time equipment parameters based on the quality inspection flow information, and takes the real-time parameters related to the flow information as the parameters to be inspected. The parameters to be inspected and the quality inspection flow information of the equipment to be inspected are then fed back to the AR glasses as target quality inspection information.
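The screening of real-time parameters against the quality inspection flow information might look like the following sketch (the field names, such as `params`, and the sample values are assumptions for illustration):

```python
def screen_parameters(realtime_params, flow_info):
    """Keep only the real-time device parameters that the quality
    inspection flow actually refers to. `flow_info` is assumed to list,
    per step, the names of the parameters that step checks."""
    needed = {name for step in flow_info for name in step.get("params", [])}
    return {k: v for k, v in realtime_params.items() if k in needed}

params = {"temperature": 61.5, "pressure": 2.1, "fan_rpm": 1200, "uptime_h": 88}
flow = [
    {"step": "check cooling", "params": ["temperature", "fan_rpm"]},
    {"step": "check seals",   "params": ["pressure"]},
]
to_inspect = screen_parameters(params, flow)  # uptime_h is filtered out
```

Only `to_inspect`, together with the flow information itself, would then be sent to the AR glasses as target quality inspection information.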
According to the scheme provided by the embodiment of the application, based on the pre-constructed communication link, the network side server can send the real-time equipment parameters of the equipment to be inspected to the AR glasses. The AR glasses can directly display the parameters to be inspected according to actual requirements, or pre-process them so that they are presented to quality inspection personnel more reasonably. Within the quality inspector's field of view, the quality inspection flow and real-time parameters of the equipment to be inspected can be seen intuitively, helping the inspector finish the quality inspection operation accurately and rapidly.
Based on the solution provided in the foregoing embodiment, optionally, as shown in fig. 5a, the solution provided in the embodiment of the present application further includes the following steps:
S51: and receiving a man-machine interaction instruction, wherein the man-machine interaction instruction comprises a voice instruction and/or an action instruction.
In practical applications, the AR glasses may receive audio information from the environment through the microphone, and process it to extract the speech related to operation instructions, thereby obtaining voice instructions. Action instructions may be generated by the AR glasses recognizing specific actions, such as gestures, in the picture captured by the camera. Optionally, the AR glasses may also be connected to other somatosensory devices to obtain additional action instructions from the quality inspector, for example acquiring the inspector's hand movements more accurately through smart gloves, or acquiring limb movements through devices worn on the arms, legs or elsewhere.
Specifically, the action instruction may include a gesture instruction, which may be implemented based on a gesture-recognition man-machine interaction technology. In this example, the gesture recognition of the AR glasses in the augmented reality platform adopts TOF (Time of Flight) technology: depth-map information of the gesture is obtained by calculating the round-trip time of an emitted infrared laser pulse, and this depth information is then fed into a recognition model to realize gesture recognition.
A gesture recognition scenario based on TOF technology in this example is shown in Fig. 5b. Light from the environmental object (i.e., the hand) is collected by the pixel array through the optical lens; gesture information is periodically sampled and converted into data by the timer and analog-to-digital converter; the processor and memory turn it into depth-map data; and gesture recognition is then performed on the sequence of depth maps. In practical application, the optical parameters of the light source can be adjusted according to the quality of the gesture information acquired by the pixel array, and the ambient light reaching the pixel array through the optical lens controlled accordingly, to optimize the quality of the acquired gesture information.
Compared with the two alternative schemes of structured light and multi-camera imaging, TOF is more flexible in design: the range of 3D imaging can be adjusted by changing the intensity of the light source, and recognition requirements in different application environments can be met by adjusting the light source's pulse frequency. Computationally, TOF is the most convenient and feasible for gesture recognition: it requires no computer-vision computation, offers a faster refresh rate and high scanning precision, and can effectively improve the experience of using AR glasses.
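The depth computation underlying TOF is the round-trip relation distance = c · t / 2: the emitted infrared pulse travels to the hand and back, so half the round-trip time at the speed of light gives the depth. A one-function sketch:

```python
def tof_depth(round_trip_s, c=299_792_458.0):
    """Depth (in meters) of a surface from the round-trip time (in
    seconds) of an emitted light pulse: distance = c * t / 2."""
    return c * round_trip_s / 2.0

# A 4 ns round trip corresponds to roughly 0.6 m, a typical
# hand-to-glasses distance.
d = tof_depth(4e-9)
```

Applying this per pixel of the sensor array yields the depth map on which gesture recognition operates.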
In addition to gesture instructions, the operation control of the AR glasses can also be realized through voice instructions. The human-computer interaction technology based on voice recognition can enable the AR glasses to convert voice instructions uttered by a user into specific computer instructions through understanding and classifying functions.
Fig. 5c shows a schematic flow chart for recognizing voice instructions based on a pre-trained semantic recognition model. In this example, the user's operational intent may be identified with a pre-trained semantic recognition model. For example, a language model may be trained from a language database and an acoustic model from a speech database. During training, speech and language are associated based on phonetic and linguistic knowledge, and training quality is optimized through data mining, signal processing and statistical modeling, so that the trained semantic recognition model can output the meaning of the user's utterance and produce a recognition result. In application, the digital voice signal collected by the audio capture device is preprocessed by the front-end module (endpoint detection, noise reduction, feature extraction and the like), decoded by the pre-trained semantic recognition model, and a voice instruction is generated from the model's output.
The AR glasses can further be provided with an eye-movement module that collects the movements of the quality inspector's pupils, which can be used to help determine the equipment to be inspected. Optionally, the direction and angle of head movement are obtained from the gyroscope of the AR glasses and used to determine the inspector's field of view. The eye-movement module acquires the pixel coordinates of both pupils and of the viewpoints on the AR glasses' display screen, and converts them into coordinates in the real-world coordinate system. The world coordinates of the two pupils and of the corresponding viewpoints on the display screen determine the gaze vectors of the two eyes, and the intersection of the two gaze vectors is the gaze target. The gaze target within the inspector's field of view is designated as the primary device to be inspected.
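Finding the "intersection" of the two gaze vectors can be sketched as two-ray triangulation: since measured rays rarely intersect exactly, the midpoint of the shortest segment between them is a common approximation. This is a generic geometric sketch, not the patent's specified method; the coordinates below are illustrative.

```python
def gaze_target(o1, d1, o2, d2):
    """Approximate the fixation point as the midpoint of the shortest
    segment between the two gaze rays o + t*d, using the standard
    closest-points-between-lines formula."""
    def dot(a, b): return sum(x * y for x, y in zip(a, b))
    def sub(a, b): return tuple(x - y for x, y in zip(a, b))
    r = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None  # gaze rays are (nearly) parallel: no fixation point
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = tuple(o + t1 * v for o, v in zip(o1, d1))
    p2 = tuple(o + t2 * v for o, v in zip(o2, d2))
    return tuple((x + y) / 2 for x, y in zip(p1, p2))
```

For example, eyes at (±0.03, 0, 0) both looking at a point one meter ahead yield a fixation point near (0, 0, 1); the device nearest that point would be taken as the primary inspection device.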
Optionally, the motion instruction may further include head movements, such as nodding and shaking, detected by the gyroscope-based attitude sensing module of the AR glasses. A nod may correspond to a "yes" instruction and a head shake to a "no" instruction.
Optionally, the AR glasses may also interact with the quality inspector by querying. For example, the AR glasses preliminarily identify a candidate device to be inspected in the picture based on the inspector's head and pupil movements, and then ask the inspector to confirm it as the primary inspection device. The gyroscope detects whether the inspector's head nods within a preset time period: if yes, the primary inspection device is confirmed; if not, the query is repeated; and when the number of queries exceeds a preset threshold, the device to be inspected is redetermined.
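The query-and-confirm loop just described can be sketched as follows, with the gyroscope nod check stubbed out as an injected function (`detect_nod` and the return labels are assumptions of this sketch):

```python
def confirm_device(detect_nod, max_queries=3):
    """Ask the inspector to confirm the candidate device: a nod detected
    within the preset window confirms it; otherwise ask again, and after
    `max_queries` unanswered queries fall back to re-selecting the
    device. `detect_nod` stands in for the gyroscope check."""
    for _ in range(max_queries):
        if detect_nod():          # nod detected within the time window
            return "confirmed"
    return "reselect"             # threshold exceeded: redetermine device
```

Injecting the sensor check keeps the confirmation logic testable independently of the hardware.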
S52: and executing the operation corresponding to the man-machine interaction instruction, wherein the operation corresponding to the man-machine interaction instruction comprises the steps of collecting an appearance image through the camera and/or switching the content displayed in the display area.
In practical applications, the corresponding relationship between the man-machine interaction instruction and the operation may be preset, for example, the corresponding relationship between the gesture and the AR glasses display operation is set, so that the content displayed in the AR glasses display area is flexibly and conveniently controlled through the gesture operation.
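The preset correspondence between man-machine interaction instructions and operations could be as simple as a lookup table; the gesture names and operation names below are hypothetical illustrations, not values defined by the patent:

```python
# Hypothetical mapping between recognized gestures and AR-glasses
# operations (image capture, switching displayed content).
GESTURE_ACTIONS = {
    "pinch":       "capture_appearance_image",
    "swipe_left":  "show_region_overlay",
    "swipe_right": "show_flow_panel",
    "open_palm":   "show_both",
}

def dispatch(gesture):
    """Map a recognized gesture to an operation; unrecognized gestures
    are ignored rather than guessed at."""
    return GESTURE_ACTIONS.get(gesture, "noop")
```

Keeping the mapping in data rather than code makes it easy to reconfigure per deployment.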
Fig. 5d shows a schematic diagram of an interaction flow based on man-machine interaction instructions. The interactive flow can be realized by an enhanced display system, a man-machine interaction system and an intelligent processing system.
The enhanced display system can be composed of a three-dimensional mapping module, an identification database and a live-action enhancement module and is used for detecting and identifying scene states and positioning scene information in real time, transmitting the information to the intelligent processing system and fusing the enhanced information of the intelligent processing system into a live-action scene for display.
The man-machine interaction system consists of a signal recognition module, an API (Application Programming Interface) library, a command processing module and a command database. It recognizes and acquires commands input by the user, such as eye gaze, gestures and sounds, and obtains specific digital information by comparison against the API library; the command processing module then converts that digital information into specific commands and transmits them to the intelligent processing system.
The intelligent processing system consists of an intelligent control module, a guiding and retrieving module and a script animation database, and is used for receiving command data transmitted from outside, managing and summarizing the command data through script data logic in the database and outputting enhanced information.
Based on the scheme provided by the embodiment of the application, the AR glasses can acquire various instructions of the user, and the display picture of the AR glasses is enhanced according to the intention of the user in a mode of analyzing and interacting the instructions, so that quality inspectors can flexibly execute interaction operation according to actual requirements.
According to the scheme provided by the application, quality inspection personnel can interact with the AR glasses in a mode of combining various modes of voice and action, so that the AR glasses are flexibly and efficiently controlled to execute quality inspection related operations, and the quality inspection efficiency of equipment is effectively improved.
Optionally, a positioning module may be further provided in the AR glasses, and the position of the quality inspection person wearing the AR glasses is obtained in real time through the positioning module of the AR glasses, and a quality inspection record is generated based on multiple information such as a moving track of the quality inspection person, a quality inspection interaction action, a quality inspection result feedback, and the quality inspection record may be used for equipment fault analysis and maintenance.
In order to solve the problems in the prior art, the present embodiment further provides an apparatus quality inspection device 60 based on AR glasses, as shown in fig. 6, including:
the acquisition module 61, configured to acquire an appearance image of the equipment to be inspected through a camera of the AR glasses;
the obtaining module 62, configured to obtain quality inspection information matched with the appearance image from a network side server through a pre-constructed communication link, wherein the quality inspection information comprises a virtual model of the equipment to be inspected and quality inspection flow information corresponding to at least one area to be inspected in the virtual model;
the determining module 63, configured to compare the real-time picture acquired by the camera with the virtual model to determine a target to-be-inspected area contained in the real-time picture;
and the display module 64, configured to display the quality inspection flow information corresponding to the target to-be-inspected area in the display area of the AR glasses.
The device provided by the embodiment of the application firstly collects the appearance image of the equipment to be inspected through the camera of the AR glasses; then, quality inspection information matched with the appearance image is obtained from a network side server through a pre-constructed communication link, the network side stores quality inspection information of at least one device, and the quality inspection information comprises a virtual model of the device to be inspected and quality inspection flow information corresponding to at least one region to be inspected in the virtual model; then, comparing the real-time picture acquired by the camera with the virtual model to determine a target to-be-detected area contained in the real-time picture; and finally, displaying quality inspection flow information corresponding to the target to-be-inspected area in the display area of the AR glasses. The quality inspection flow information is displayed through the display area, so that quality inspection personnel can intuitively see the quality inspection flow information corresponding to the to-be-inspected area of the to-be-inspected equipment in the display area after wearing the AR glasses. In the process that quality inspection personnel execute quality inspection, virtual models of equipment to be inspected and quality inspection flow information corresponding to the areas to be inspected can be obtained efficiently through the AR glasses, wherein the virtual models can accurately position target areas to be inspected, and accordingly the matching degree of the displayed quality inspection flow information and the areas to be inspected in the sight range of the quality inspection personnel is improved, and the accuracy of the quality inspection flow is improved. Quality control flow information through the display of AR glasses can be convenient for quality control personnel standardized quality control operation, improves quality control efficiency.
In order to solve the technical problems in the prior art, the embodiment of the present application further provides an equipment quality inspection system based on AR glasses, as shown in fig. 7, including:
AR glasses-based device quality inspection apparatus 71 as in any of the examples above;
a network side server 72 communicatively coupled to the AR glasses-based device quality inspection apparatus via a pre-constructed communication link.
The system provided by the embodiment of the application executes the same flow as the apparatus described above: the camera of the AR glasses collects an appearance image of the equipment to be inspected; quality inspection information matched with the appearance image is obtained from the network side server through the pre-constructed communication link; the real-time picture acquired by the camera is compared with the virtual model to determine the target to-be-inspected area; and the corresponding quality inspection flow information is displayed in the display area of the AR glasses. The same technical effects are achieved: the virtual model accurately positions the target to-be-inspected area, the displayed quality inspection flow information matches the to-be-inspected area within the inspector's line of sight, and the quality inspection flow information displayed through the AR glasses facilitates standardized quality inspection operation and improves quality inspection efficiency.
Alternatively, the number of AR glasses-based device quality inspection apparatuses connected to the network side server may be plural, as shown by the dotted-line box in fig. 7. The network side server can pre-deploy quality inspection tasks and issue them by delivering the tasks to the AR glasses. In an application scenario with a plurality of AR glasses, each pair of AR glasses can log in to the network side server through an account, and the network side server issues tasks to each AR glasses account in a distributed deployment mode. Quality inspection personnel can then execute quality inspection according to the tasks displayed in their AR glasses, so that multiple quality inspection personnel cooperate on the quality inspection tasks in a distributed manner, avoiding missed or repeated inspection.
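The distributed issuing of pre-deployed quality inspection tasks to multiple AR-glasses accounts can be sketched as a round-robin assignment in which each task goes to exactly one account, which is what prevents both missed and duplicated inspections (account and task names are illustrative):

```python
def distribute_tasks(tasks, accounts):
    """Round-robin assignment of quality inspection tasks to logged-in
    AR-glasses accounts; every task is assigned to exactly one account."""
    assignment = {acc: [] for acc in accounts}
    for i, task in enumerate(tasks):
        assignment[accounts[i % len(accounts)]].append(task)
    return assignment

plan = distribute_tasks(["seal check", "wiring check", "fan check"],
                        ["glasses_01", "glasses_02"])
```

A real server would likely also rebalance on account login/logout and track task completion, which this sketch omits.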
The above modules in the apparatus provided by the embodiment of the present application may further implement the method steps provided by the foregoing method embodiment. Alternatively, the apparatus may include other modules besides those above, so as to implement those method steps. The apparatus provided by the embodiment of the application can achieve the technical effects achieved by the method embodiment.
Preferably, the embodiment of the present application further provides an electronic device, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor, where the computer program when executed by the processor implements each process of the above embodiment of the device quality inspection method based on AR glasses, and the process can achieve the same technical effect, so that repetition is avoided, and no further description is given here.
The embodiment of the application also provides a computer readable storage medium, on which a computer program is stored; when executed by a processor, the program implements the processes of the above embodiment of the AR glasses-based device quality inspection method and can achieve the same technical effects, which are not described again here to avoid repetition. The computer readable storage medium is, for example, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape/magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.
Claims (10)
1. An AR glasses-based device quality inspection method, applied to augmented reality (AR) glasses, the method comprising:
collecting an appearance image of a device to be inspected through a camera of the AR glasses;
acquiring quality inspection information matched with the appearance image from a network side server through a pre-constructed communication link, wherein the network side server stores quality inspection information of at least one device, and the quality inspection information comprises a virtual model of the device to be inspected and quality inspection flow information corresponding to at least one area to be inspected in the virtual model;
comparing a real-time picture acquired by the camera with the virtual model to determine a target area to be inspected contained in the real-time picture; and
displaying the quality inspection flow information corresponding to the target area to be inspected in a display area of the AR glasses.
2. The method of claim 1, further comprising, before displaying the quality inspection flow information corresponding to the target area to be inspected in the display area of the AR glasses:
generating live-action enhancement information based on the quality inspection flow information corresponding to the target area to be inspected, wherein the live-action enhancement information comprises an area virtual image of the target area to be inspected and a flow virtual image of the corresponding quality inspection flow;
wherein displaying the quality inspection flow information corresponding to the target area to be inspected in the display area of the AR glasses comprises:
fusing the real-time picture acquired by the camera with the live-action enhancement information, and displaying the fused live-action-enhanced real-time picture through the display area of the AR glasses.
3. The method of claim 2, wherein fusing the real-time picture acquired by the camera with the live-action enhancement information and displaying the fused live-action-enhanced real-time picture through the display area of the AR glasses comprises:
determining a display area for displaying the live-action enhancement information in a real-time picture acquired by the camera, wherein the display area comprises a first display area for displaying the area virtual image and a second display area for displaying the process virtual image;
and displaying the live-action-enhanced real-time picture in the display area of the AR glasses, wherein the live-action-enhanced real-time picture comprises the area virtual image located in the first display area and/or the flow virtual image located in the second display area.
4. The method of claim 1, wherein acquiring quality inspection information matched with the appearance image from a network side server through a pre-constructed communication link comprises:
sending the appearance image to the network side server through the pre-constructed communication link, wherein the network side server is configured to compare the appearance image with a pre-stored virtual model of at least one device to be inspected and to send, to the AR glasses, target quality inspection information of the device to be inspected that matches the appearance image; and
receiving the target quality inspection information fed back by the network side server.
5. The method of claim 4, wherein the target quality inspection information includes real-time device parameters collected in real time from the device to be inspected by the network side server, the real-time device parameters including the parameters to be inspected contained in the quality inspection flow information of the target quality inspection information.
6. The method of any one of claims 1-5, further comprising:
receiving a human-machine interaction instruction, wherein the human-machine interaction instruction comprises a voice instruction and/or an action instruction; and
executing an operation corresponding to the human-machine interaction instruction, wherein the operation comprises collecting an appearance image through the camera and/or switching the content displayed in the display area.
7. An AR glasses-based equipment quality inspection device, comprising:
a collection module, configured to collect an appearance image of a device to be inspected through a camera of the AR glasses;
an acquisition module, configured to acquire quality inspection information matched with the appearance image from a network side server through a pre-constructed communication link, wherein the quality inspection information comprises a virtual model of the device to be inspected and quality inspection flow information corresponding to at least one area to be inspected in the virtual model;
a determining module, configured to compare a real-time picture acquired by the camera with the virtual model to determine a target area to be inspected contained in the real-time picture; and
a display module, configured to display the quality inspection flow information corresponding to the target area to be inspected in a display area of the AR glasses.
8. An AR glasses-based device quality inspection system, comprising:
the AR-glasses-based device quality inspection apparatus of claim 7;
and a network side server, communicatively connected to the AR glasses-based device quality inspection apparatus through a pre-constructed communication link.
9. An electronic device, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the method according to any one of claims 1 to 6.
10. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 6.
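The overall flow of claim 1 can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the image matching is reduced to simple label comparison, whereas real AR glasses would use visual recognition against the virtual model, and all device names, regions, and flow texts are hypothetical.

```python
# Illustrative sketch of the claim-1 pipeline: acquire quality inspection
# information matching the device's appearance, match the real-time picture
# against the virtual model's areas to be inspected, and select the quality
# inspection flow information to display in the AR glasses display area.
from dataclasses import dataclass, field

@dataclass
class QualityInspectionInfo:
    device: str
    # virtual model stand-in: area to be inspected -> quality inspection flow info
    regions: dict = field(default_factory=dict)

# stand-in for the quality inspection information stored on the network side server
SERVER_DB = {
    "pump-X100": QualityInspectionInfo(
        device="pump-X100",
        regions={
            "inlet flange": "check bolt torque, then seal condition",
            "pressure gauge": "verify reading is within rated range",
        },
    )
}

def acquire_quality_inspection_info(appearance_label: str) -> QualityInspectionInfo:
    """Stand-in for sending the appearance image over the communication link."""
    return SERVER_DB[appearance_label]

def match_target_region(frame_label: str, info: QualityInspectionInfo) -> str:
    """Stand-in for comparing the real-time picture with the virtual model."""
    for region in info.regions:
        if region in frame_label:
            return region
    raise LookupError("no area to be inspected found in frame")

info = acquire_quality_inspection_info("pump-X100")
region = match_target_region("close-up of inlet flange", info)
display_text = info.regions[region]  # shown in the AR glasses display area
```

The label lookup here plays the role of both the appearance-image match on the server side and the real-time-picture comparison on the glasses side; only the data flow between the four claimed steps is intended to be faithful.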
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410141224.8A CN118131904A (en) | 2024-01-31 | 2024-01-31 | Equipment quality inspection method, device and system based on AR (augmented reality) glasses |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410141224.8A CN118131904A (en) | 2024-01-31 | 2024-01-31 | Equipment quality inspection method, device and system based on AR (augmented reality) glasses |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN118131904A true CN118131904A (en) | 2024-06-04 |
Family
ID=91239349
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202410141224.8A Pending CN118131904A (en) | 2024-01-31 | 2024-01-31 | Equipment quality inspection method, device and system based on AR (augmented reality) glasses |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN118131904A (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN120508432A (en) * | 2025-07-18 | 2025-08-19 | 苏州元脑智能科技有限公司 | Fault handling method, electronic device, wearable device, and storage medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11954904B2 (en) | Real-time gesture recognition method and apparatus | |
| US20200387697A1 (en) | Real-time gesture recognition method and apparatus | |
| CN106033601B (en) | The method and apparatus for detecting abnormal case | |
| Yin et al. | Synchronous AR assembly assistance and monitoring system based on ego-centric vision | |
| EP3665676A1 (en) | Speaking classification using audio-visual data | |
| CN107004279A (en) | Natural user interface camera calibrated | |
| CA2913541A1 (en) | Smart prosthesis for facilitating artificial vision using scene abstraction | |
| US20200357177A1 (en) | Apparatus and method for generating point cloud data | |
| KR20220011078A (en) | Active interaction method, device, electronic equipment and readable storage medium | |
| Schütt et al. | Semantic interaction in augmented reality environments for microsoft hololens | |
| Núnez et al. | Real-time human body tracking based on data fusion from multiple RGB-D sensors | |
| CN110211222A (en) | A kind of AR immersion tourism guide method, device, storage medium and terminal device | |
| Mesbahi et al. | Hand gesture recognition based on various deep learning YOLO models | |
| CN118131904A (en) | Equipment quality inspection method, device and system based on AR (augmented reality) glasses | |
| Cherkasov et al. | The use of open and machine vision technologies for development of gesture recognition intelligent systems | |
| CN119722998A (en) | Training method and system based on multimodal IoT perception and virtual-real symbiosis | |
| Mazzamuto et al. | A Wearable Device Application for Human-Object Interactions Detection. | |
| Vidhate et al. | Virtual paint application by hand gesture recognition system | |
| Yang et al. | Towards generic 3d tracking in RGBD videos: Benchmark and baseline | |
| KR20230053262A (en) | A 3D object recognition method based on a 2D real space image and a computer program recorded on a recording medium to execute the same | |
| CN116844133B (en) | Target detection method, device, electronic equipment and medium | |
| CN117435055A (en) | Gesture-enhanced eye tracking human-computer interaction method based on spatial stereoscopic display | |
| CN116841391A (en) | Digital human interaction control method, device, electronic equipment and storage medium | |
| Wu et al. | 3d semantic vslam of dynamic environment based on yolact | |
| Deligiannakis et al. | Mixed Reality with Hardware Acceleration: implementing the multimodal user interface vision |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||