
CN114419469A - Target identification method and device, AR device and readable storage medium - Google Patents

Target identification method and device, AR device and readable storage medium

Info

Publication number
CN114419469A
CN114419469A
Authority
CN
China
Prior art keywords
image data
target
dimensional image
recognition
identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111646423.7A
Other languages
Chinese (zh)
Inventor
王奥博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Technology Co Ltd
Original Assignee
Goertek Optical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Optical Technology Co Ltd
Priority to CN202111646423.7A
Publication of CN114419469A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure provides a target recognition method and apparatus, an AR device, and a readable storage medium. The method includes: when a target recognition task is received, acquiring the ambient light intensity; when the ambient light intensity is less than a preset value, acquiring two-dimensional image data and depth image data of a target object; and obtaining a recognition result of the target recognition task based on the two-dimensional image data and the depth image data.

Description

Target recognition method, device, AR device and readable storage medium

Technical Field

Embodiments of the present disclosure relate to the field of recognition technologies, and more particularly, to a target recognition method and apparatus, an AR device, and a readable storage medium.

Background

Augmented reality (AR) devices use AR technology to simulate computer-generated virtual information such as text, images, three-dimensional models, and multimedia, and map it onto the real world, achieving a fusion of virtual information and the real world.

An AR device is usually equipped with an RGB camera for capturing images. The RGB camera acquires two-dimensional image information of an object, and a processor runs a specific algorithm on the two-dimensional image information to perform target recognition.

However, in a low-light environment the amount of information in the two-dimensional images captured by the RGB camera decreases, which lowers the target recognition rate. A new target recognition method is therefore needed.

Summary of the Invention

An object of the embodiments of the present disclosure is to provide a target recognition method that improves the target recognition rate.

According to a first aspect of the embodiments of the present disclosure, a target recognition method applied to an AR device is provided. The method includes:

when a target recognition task is received, acquiring the ambient light intensity;

when the ambient light intensity is less than a preset value, acquiring two-dimensional image data and depth image data of a target object; and

obtaining a recognition result of the target recognition task based on the two-dimensional image data and the depth image data.

Optionally, after acquiring the ambient light intensity, the method further includes:

when the ambient light intensity is greater than the preset value, acquiring two-dimensional image data of the target object; and

obtaining the recognition result of the target recognition task based on the two-dimensional image data.

Optionally, obtaining the recognition result of the target recognition task based on the two-dimensional image data and the depth image data includes:

processing the two-dimensional image data and the depth image data with a first target recognition algorithm to obtain the recognition result of the target recognition task.

Optionally, obtaining the recognition result of the target recognition task based on the two-dimensional image data includes:

processing the two-dimensional image data with a second target recognition algorithm to obtain the recognition result of the target recognition task.

According to a second aspect of the embodiments of the present disclosure, a target recognition apparatus arranged in an AR device is provided. The apparatus includes:

an acquisition module, configured to acquire the ambient light intensity when a target recognition task is received, and to acquire two-dimensional image data and depth image data of a target object when the ambient light intensity is less than a preset value; and

a recognition module, configured to obtain a recognition result of the target recognition task based on the two-dimensional image data and the depth image data.

Optionally, the acquisition module is further configured to:

acquire two-dimensional image data of the target object when the ambient light intensity is greater than the preset value;

and the recognition module is further configured to obtain the recognition result of the target recognition task based on the two-dimensional image data.

Optionally, the recognition module is specifically configured to:

process the two-dimensional image data and the depth image data with a first target recognition algorithm to obtain the recognition result of the target recognition task.

Optionally, the recognition module is specifically configured to:

process the two-dimensional image data with a second target recognition algorithm to obtain the recognition result of the target recognition task.

According to a third aspect of the embodiments of the present disclosure, an AR device is provided, including an ambient light sensor, an RGB camera, and a time-of-flight sensor. The AR device further includes:

a memory for storing executable computer instructions; and

a processor, configured to execute the target recognition method according to any one of the first aspect of the embodiments of the present disclosure under the control of the executable computer instructions;

wherein the ambient light sensor, the RGB camera, and the time-of-flight sensor are all connected to the processor.

According to a fourth aspect of the embodiments of the present disclosure, a readable storage medium is provided, on which a program or instructions are stored. When the program or instructions are executed by a processor, the steps of the target recognition method according to any one of the first aspect of the embodiments of the present disclosure are implemented.

A beneficial effect of the embodiments of the present disclosure is that, when a target recognition task is received, the ambient light intensity can be acquired; when the ambient light intensity is less than a preset value, two-dimensional image data and depth image data of the target object can be acquired; and the recognition result of the target recognition task can be obtained based on the two-dimensional image data and the depth image data. In this way, when the ambient light intensity is less than the preset value, the target object is recognized from both the two-dimensional image data and the depth image data, avoiding the drop in recognition rate caused by the reduced amount of information captured about the target object under low illumination and thereby significantly improving the target recognition rate.

Other features and advantages of the present specification will become apparent from the following detailed description of exemplary embodiments of the present specification with reference to the accompanying drawings.

Brief Description of the Drawings

The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the specification and, together with the description, serve to explain its principles.

FIG. 1 is a block diagram of the hardware configuration of an AR device according to an embodiment of the present disclosure;

FIG. 2 is a schematic flowchart of a target recognition method according to an embodiment of the present disclosure;

FIG. 3 is a schematic flowchart of the process of building a recognition model according to an embodiment of the present disclosure;

FIG. 4 is a schematic diagram of the convolutional neural network of the recognition model used by the target recognition method according to an embodiment of the present disclosure;

FIG. 5 is a schematic block diagram of a target recognition apparatus according to an embodiment of the present disclosure;

FIG. 6 is a schematic diagram of the hardware structure of an AR device according to an embodiment.

Detailed Description

Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that, unless specifically stated otherwise, the relative arrangement of components and steps, the numerical expressions, and the numerical values set forth in these embodiments do not limit the scope of the embodiments of the present disclosure.

The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the present disclosure or its application or uses.

Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, such techniques, methods, and apparatus should be considered part of the specification.

In all examples shown and discussed herein, any specific value should be interpreted as merely illustrative rather than limiting. Other instances of the exemplary embodiments may therefore have different values.

It should be noted that similar reference numerals and letters denote similar items in the following figures; once an item is defined in one figure, it need not be discussed further in subsequent figures.

<Hardware Configuration>

FIG. 1 is a block diagram of the hardware configuration of an AR device 1000 according to an embodiment of the present disclosure.

As shown in FIG. 1, the AR device 1000 may be, for example, a pair of AR glasses, which is not limited by the embodiments of the present disclosure. In one embodiment, as shown in FIG. 1, the AR device 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a microphone 1700, and a speaker 1800.

The processor 1100 may include, but is not limited to, a central processing unit (CPU), a microcontroller unit (MCU), and the like. The memory 1200 includes, for example, ROM (read-only memory), RAM (random-access memory), and non-volatile memory such as a hard disk. The interface device 1300 includes, for example, various bus interfaces such as a serial bus interface (including a USB interface) and a parallel bus interface. The communication device 1400 can perform wired or wireless communication. The display device 1500 is, for example, a liquid crystal display, an LED display, or a touch display. The input device 1600 includes, for example, a touch screen. The microphone 1700 can be used to input voice information, and the speaker 1800 can be used to output voice information.

In this embodiment, the memory 1200 of the AR device 1000 stores instructions for controlling the processor 1100 to implement, or support the implementation of, the target recognition method according to any embodiment. A skilled person can design the instructions according to the solutions disclosed in this specification. How instructions control a processor is well known in the art and is not described in detail here.

Those skilled in the art should understand that although several devices of the AR device 1000 are shown in FIG. 1, the AR device 1000 in the embodiments of the present disclosure may involve only some of them or may also include other devices, which is not limited here.

<Method Embodiment>

FIG. 2 is a schematic flowchart of a target recognition method according to an embodiment of the present disclosure. The method may be implemented, for example, by the AR device shown in FIG. 1.

As shown in FIG. 2, the target recognition method provided by this embodiment may include the following steps 2100 to 2300.

Step 2100: when a target recognition task is received, acquire the ambient light intensity.

In this step, the ambient light intensity may be acquired by an ambient light sensor provided in the AR device.

Step 2200: when the ambient light intensity is less than a preset value, acquire two-dimensional image data and depth image data of the target object.

It should be noted that the preset value is the illumination intensity at which the recognition rate of the AR device using only two-dimensional image data falls below the recognition rate obtained using both two-dimensional image data and depth image data.

In this step, after the ambient light intensity is acquired, it can be compared with the preset value. If the acquired ambient light intensity is less than the preset value, the two-dimensional image data and the depth image data of the target object are acquired, so as to improve the target recognition rate in low-light environments.

It should be understood that a time-of-flight sensor works by emitting infrared light within a specific range of wavelengths; the light is reflected by an object and received by the sensor after some time, and the time difference or phase difference between emission and reception is converted into distance data, yielding the image depth information of the object. Therefore, in this embodiment, the depth image data may be acquired by a time-of-flight sensor provided in the AR device, and the two-dimensional image data may be acquired by an RGB camera provided in the AR device.
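For illustration only (this relation is general time-of-flight physics rather than part of the original disclosure), the distance recovered from a direct time-of-flight measurement follows from the round trip of the emitted light:

```latex
d = \frac{c \, \Delta t}{2}
```

where d is the distance to the object, c is the speed of light, and \Delta t is the measured time difference between emission and reception; phase-based (indirect) time-of-flight sensors instead recover \Delta t from the phase difference of a modulated signal.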

It should also be understood that, after the ambient light intensity is acquired, if it is greater than the preset value, only the two-dimensional image data of the target object needs to be acquired, and the time-of-flight sensor does not need to be driven to acquire depth data of the target object. In this way, the power consumption of the AR device can be minimized while the recognition rate for the target object is maintained.

Step 2300: obtain the recognition result of the target recognition task based on the two-dimensional image data and the depth image data.

Specifically, when the ambient light intensity is less than the preset value, it can be determined that the first target recognition algorithm is used to perform the target recognition task; when the ambient light intensity is greater than the preset value, it can be determined that the second target recognition algorithm is used to perform the target recognition task.

That is, when the recognition result of the target recognition task is obtained based on the two-dimensional image data and the depth image data, the two-dimensional image data and the depth image data are processed with the first target recognition algorithm to obtain the recognition result of the target recognition task. When the recognition result is obtained based on the two-dimensional image data alone, the two-dimensional image data is processed with the second target recognition algorithm to obtain the recognition result of the target recognition task.
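As an illustrative, non-limiting sketch of the selection logic described above (the sensor interfaces, threshold value, and recognition callables below are hypothetical and not specified by this disclosure), the branching could be organized as follows:

```python
# Illustrative sketch of light-based algorithm selection; all names are hypothetical.
LUX_THRESHOLD = 50.0  # the "preset value"; the disclosure does not fix a number

def recognize_target(ambient_light_sensor, rgb_camera, tof_sensor,
                     first_algorithm, second_algorithm):
    """Run one target recognition task on an AR device."""
    lux = ambient_light_sensor.read()   # acquire ambient light intensity
    rgb_image = rgb_camera.capture()    # two-dimensional image data

    if lux < LUX_THRESHOLD:
        # Low light: also capture depth data and use the RGB-D algorithm.
        depth_image = tof_sensor.capture()
        return first_algorithm(rgb_image, depth_image)

    # Sufficient light: the RGB-only algorithm is enough, and the time-of-flight
    # sensor stays idle to save power.
    return second_algorithm(rgb_image)
```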

It should be noted here that, as shown in FIG. 3, both the first target recognition algorithm and the second target recognition algorithm are implemented with a recognition model. The recognition model can be built through the following steps 3100 to 3300:

Step 3100: construct a training data set based on the mapping between input image data and correct target recognition results.

Step 3200: learn from the large amount of training data in the training data set with a deep learning algorithm and output recognition results.

Step 3300: compare the recognition results with the corresponding target recognition results; if there is an error between them, adjust the parameters of the deep learning algorithm to reduce the recognition error and improve the accuracy of the target recognition algorithm.

The recognition model is obtained by repeating steps 3200 and 3300.
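The following is a minimal sketch of the training procedure of steps 3100 to 3300, assuming a PyTorch-style model; the dataset contents, batch size, and learning rate are hypothetical and not part of the original disclosure:

```python
# Minimal training-loop sketch for steps 3100-3300 (assumed hyperparameters).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader

def train_recognition_model(model, dataset, epochs=10, lr=1e-3):
    loader = DataLoader(dataset, batch_size=32, shuffle=True)  # step 3100: (image, label) pairs
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)

    for _ in range(epochs):                    # repeat steps 3200 and 3300
        for images, labels in loader:
            outputs = model(images)            # step 3200: output recognition results
            loss = criterion(outputs, labels)  # step 3300: compare with target results
            optimizer.zero_grad()
            loss.backward()                    # adjust parameters to reduce the error
            optimizer.step()
    return model
```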

Deep learning typically uses a convolutional neural network, which is a multi-layer hierarchical neural network consisting of convolutional layers, pooling layers, and fully connected layers. As shown in FIG. 4, the convolutional layers extract the features contained in the input data and can be used for image filtering to extract local feature information from an image. The pooling layers strengthen robustness against deviations in the input signal; usually the maximum of the convolutional layer's outputs is taken as the pooling layer's output, and the final deep learning output is obtained through the fully connected layers.

In this embodiment, the convolutional neural network is implemented as follows: the image filters in the convolutional layers are initialized; each image filter computes a weighted sum of the pixels at every position of the input data, convolves the weighted sums, and passes the result to the pooling layer; the pooling operator selects a point in the filter output, searches its neighborhood, and takes the maximum value as the output passed to the fully connected layer; and the fully connected layer applies weights, thresholds, and a transfer function to its input to produce the output target result.
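Below is an illustrative sketch of a small convolutional network of the kind FIG. 4 describes (convolutional layers, max pooling, and a fully connected classifier). Feeding the RGB image and the depth image as a single 4-channel input, and the 64x64 input resolution, are assumptions made only for this example; the disclosure does not specify how the two data streams are fused.

```python
# Illustrative CNN sketch; RGB-D fusion as a 4-channel input and the 64x64
# input size are assumptions, not part of the original disclosure.
import torch
import torch.nn as nn

class RGBDRecognitionNet(nn.Module):
    def __init__(self, num_classes=10, in_channels=4):  # 3 RGB channels + 1 depth channel
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),  # convolutional layer: local features
            nn.ReLU(),
            nn.MaxPool2d(2),                                       # pooling layer: max over a neighborhood
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)     # fully connected layer (64x64 input assumed)

    def forward(self, x):                      # x: (batch, 4, 64, 64) RGB-D tensor
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)
```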

According to this embodiment, when a target recognition task is received, the ambient light intensity is acquired; when the ambient light intensity is less than a preset value, two-dimensional image data and depth image data of the target object are acquired; and the recognition result of the target recognition task is obtained based on the two-dimensional image data and the depth image data. In this way, when the ambient light intensity is less than the preset value, the target object is recognized from both the two-dimensional image data and the depth image data, avoiding the drop in recognition rate caused by the reduced amount of information captured about the target object under low illumination and significantly improving the target recognition rate.

<Apparatus Embodiment>

An embodiment of the present disclosure provides a target recognition apparatus arranged in an AR device. As shown in FIG. 5, the target recognition apparatus 5000 may include an acquisition module 5100 and a recognition module 5200.

The acquisition module 5100 is configured to acquire the ambient light intensity when a target recognition task is received, and to acquire two-dimensional image data and depth image data of the target object when the ambient light intensity is less than a preset value.

The recognition module 5200 is configured to obtain the recognition result of the target recognition task based on the two-dimensional image data and the depth image data.

In one embodiment, the acquisition module 5100 is further configured to acquire the two-dimensional image data of the target object when the ambient light intensity is greater than the preset value, and the recognition module 5200 is further configured to obtain the recognition result of the target recognition task based on the two-dimensional image data.

In one embodiment, the recognition module 5200 is specifically configured to process the two-dimensional image data and the depth image data with the first target recognition algorithm to obtain the recognition result of the target recognition task.

In one embodiment, the recognition module 5200 is specifically configured to process the two-dimensional image data with the second target recognition algorithm to obtain the recognition result of the target recognition task.

The target recognition apparatus of this embodiment can be used to execute the technical solutions of the above method embodiment; its implementation principles and technical effects are similar and are not repeated here.

<Device Embodiment>

FIG. 6 is a schematic diagram of the hardware structure of an AR device according to an embodiment. As shown in FIG. 6, the AR device 6000 may include an ambient light sensor 6100, an RGB camera 6200, and a time-of-flight sensor 6300. The AR device 6000 may further include a memory 6400 for storing executable computer instructions, and a processor 6500 for executing the target recognition method according to the above embodiments under the control of the executable computer instructions, wherein the ambient light sensor 6100, the RGB camera 6200, and the time-of-flight sensor 6300 are all connected to the processor 6500.

In another embodiment, the AR device 6000 may include the target recognition apparatus 5000 described above.

In one embodiment, the modules of the target recognition apparatus 5000 may be implemented by the processor 6500 running computer instructions stored in the memory 6400.

<Computer-Readable Storage Medium>

An embodiment of the present disclosure further provides a computer-readable storage medium storing computer instructions which, when run by a processor, perform the target recognition method provided by the embodiments of the present disclosure.

The present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer-readable storage medium carrying computer-readable program instructions for causing a processor to implement various aspects of the present disclosure.

A computer-readable storage medium may be a tangible device that can hold and store instructions for use by an instruction-execution device. The computer-readable storage medium may be, for example, but is not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of computer-readable storage media include: a portable computer diskette, a hard disk, random-access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random-access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as a punch card or a raised structure in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as being a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (for example, a light pulse through a fiber-optic cable), or an electrical signal transmitted through a wire.

The computer-readable program instructions described herein can be downloaded to respective computing/processing devices from a computer-readable storage medium, or to an external computer or external storage device via a network, for example the Internet, a local area network, a wide area network, and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers. A network adapter card or network interface in each computing/processing device receives the computer-readable program instructions from the network and forwards them for storage in a computer-readable storage medium within the respective computing/processing device.

Computer program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++ and conventional procedural programming languages such as the "C" language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the remote-computer case, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA) may be personalized by using state information of the computer-readable program instructions, and this electronic circuitry may execute the computer-readable program instructions to implement various aspects of the present disclosure.

Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.

These computer-readable program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data-processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data-processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, a programmable data-processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium having the instructions stored therein comprises an article of manufacture including instructions which implement aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.

The computer-readable program instructions may also be loaded onto a computer, other programmable data-processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus, or other device to produce a computer-implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.

The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical functions. In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures; for example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending on the functionality involved. It is also noted that each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions. It is well known to those skilled in the art that implementation in hardware, implementation in software, and implementation in a combination of software and hardware are all equivalent.

The embodiments of the present disclosure have been described above. The foregoing description is exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, their practical application, or technical improvements over technologies in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A target recognition method, applied to an AR device, the method comprising:
acquiring the ambient light intensity when a target recognition task is received;
acquiring two-dimensional image data and depth image data of a target object when the ambient light intensity is less than a preset value; and
obtaining a recognition result of the target recognition task based on the two-dimensional image data and the depth image data.
2. The method of claim 1, wherein after acquiring the ambient light intensity, the method further comprises:
acquiring two-dimensional image data of the target object when the ambient light intensity is greater than the preset value; and
obtaining the recognition result of the target recognition task based on the two-dimensional image data.
3. The method of claim 1, wherein obtaining the recognition result of the target recognition task based on the two-dimensional image data and the depth image data comprises:
processing the two-dimensional image data and the depth image data with a first target recognition algorithm to obtain the recognition result of the target recognition task.
4. The method of claim 2, wherein obtaining the recognition result of the target recognition task based on the two-dimensional image data comprises:
processing the two-dimensional image data with a second target recognition algorithm to obtain the recognition result of the target recognition task.
5. A target recognition apparatus, arranged in an AR device, the apparatus comprising:
an acquisition module, configured to acquire the ambient light intensity when a target recognition task is received, and to acquire two-dimensional image data and depth image data of a target object when the ambient light intensity is less than a preset value; and
a recognition module, configured to obtain a recognition result of the target recognition task based on the two-dimensional image data and the depth image data.
6. The apparatus of claim 5, wherein the acquisition module is further configured to:
acquire two-dimensional image data of the target object when the ambient light intensity is greater than the preset value;
and the recognition module is further configured to obtain the recognition result of the target recognition task based on the two-dimensional image data.
7. The apparatus of claim 5, wherein the recognition module is specifically configured to:
process the two-dimensional image data and the depth image data with a first target recognition algorithm to obtain the recognition result of the target recognition task.
8. The apparatus of claim 6, wherein the recognition module is specifically configured to:
process the two-dimensional image data with a second target recognition algorithm to obtain the recognition result of the target recognition task.
9. An AR device comprising an ambient light sensor, an RGB camera, and a time-of-flight sensor, the AR device further comprising:
a memory for storing executable computer instructions; and
a processor for executing the target recognition method according to any one of claims 1 to 4 under the control of the executable computer instructions;
wherein the ambient light sensor, the RGB camera, and the time-of-flight sensor are all connected to the processor.
10. A readable storage medium, on which a program or instructions are stored, wherein the program or instructions, when executed by a processor, implement the steps of the target recognition method according to any one of claims 1 to 4.
CN202111646423.7A 2021-12-29 2021-12-29 Target identification method and device, AR device and readable storage medium Pending CN114419469A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111646423.7A CN114419469A (en) 2021-12-29 2021-12-29 Target identification method and device, AR device and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111646423.7A CN114419469A (en) 2021-12-29 2021-12-29 Target identification method and device, AR device and readable storage medium

Publications (1)

Publication Number Publication Date
CN114419469A true CN114419469A (en) 2022-04-29

Family

ID=81268753

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111646423.7A Pending CN114419469A (en) 2021-12-29 2021-12-29 Target identification method and device, AR device and readable storage medium

Country Status (1)

Country Link
CN (1) CN114419469A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119292304A (en) * 2024-10-09 2025-01-10 新一代信息技术江苏有限公司 UAV control method and system based on artificial intelligence

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105408938A (en) * 2013-02-28 2016-03-16 谷歌技术控股有限责任公司 System for 2D/3D spatial feature processing
CN110853127A (en) * 2018-08-20 2020-02-28 浙江宇视科技有限公司 Image processing method, device and equipment
CN108989783A (en) * 2018-08-22 2018-12-11 Oppo广东移动通信有限公司 Electronic device and control method for electronic device
CN110458041A (en) * 2019-07-19 2019-11-15 国网安徽省电力有限公司建设分公司 A method and system for face recognition based on RGB-D camera
CN112365530A (en) * 2020-11-04 2021-02-12 Oppo广东移动通信有限公司 Augmented reality processing method and device, storage medium and electronic equipment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN119292304A (en) * 2024-10-09 2025-01-10 新一代信息技术江苏有限公司 UAV control method and system based on artificial intelligence
CN119292304B (en) * 2024-10-09 2025-08-05 新一代信息技术江苏有限公司 Artificial intelligence-based drone control method and system

Similar Documents

Publication Publication Date Title
CN107766839B (en) Motion recognition method and device based on 3D convolutional neural network
CN111722245B (en) Positioning method, positioning device and electronic equipment
CN111739005B (en) Image detection method, device, electronic device and storage medium
WO2020063475A1 (en) 6d attitude estimation network training method and apparatus based on deep learning iterative matching
CN111767853B (en) Lane line detection method and device
CN113920307A (en) Model training method, device, equipment, storage medium and image detection method
KR20210013150A (en) Lighting estimation
CN112529073A (en) Model training method, attitude estimation method and apparatus, and electronic device
CN111753961A (en) Model training method and device, prediction method and device
CN111291650A (en) Method and device for automatic parking assistance
CN113869429A (en) Model training method and image processing method
CN115578515B (en) Training method of three-dimensional reconstruction model, three-dimensional scene rendering method and device
CN114419564B (en) Vehicle pose detection method, device, equipment, medium and automatic driving vehicle
US11842496B2 (en) Real-time multi-view detection of objects in multi-camera environments
CN110427915B (en) Method and apparatus for outputting information
CN110717933A (en) Post-processing method, device, equipment and medium for moving object missed detection
CN115511779A (en) Image detection method, device, electronic device and storage medium
CN115375742A (en) Method and system for generating depth image
CN111833391B (en) Image depth information estimation method and device
CN115205806A (en) Method and device for generating target detection model and automatic driving vehicle
CN113705390A (en) Positioning method, positioning device, electronic equipment and storage medium
CN112329732A (en) Model generation method and device, electronic equipment and storage medium
CN112488126A (en) Feature map processing method, device, equipment and storage medium
CN116721139A (en) Generating depth images of image data
CN111866492A (en) Image processing method, device and device based on head-mounted display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
Effective date of registration: 20221121
Address after: No. 500, Songling Road, Laoshan District, Qingdao, Shandong 266101
Applicant after: GOERTEK TECHNOLOGY Co.,Ltd.
Address before: 261061 workshop 1, phase III, Geer Photoelectric Industrial Park, 3999 Huixian Road, Yongchun community, Qingchi street, high tech Zone, Weifang City, Shandong Province
Applicant before: GoerTek Optical Technology Co.,Ltd.