CN120416321A - Device, method and computer program product for object tracing - Google Patents
Info
- Publication number
- CN120416321A (application number CN202410124157.9A)
- Authority
- CN
- China
- Prior art keywords
- identification code
- serial number
- image
- cloud
- module
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
- G06Q10/083—Shipping
- G06Q10/0833—Tracking
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/08—Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/80—Recognising image objects characterised by unique random patterns
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Economics (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Strategic Management (AREA)
- Marketing (AREA)
- Operations Research (AREA)
- Human Resources & Organizations (AREA)
- Quality & Reliability (AREA)
- Tourism & Hospitality (AREA)
- Entrepreneurship & Innovation (AREA)
- General Business, Economics & Management (AREA)
- Development Economics (AREA)
- Computer Security & Cryptography (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Computer Hardware Design (AREA)
- General Health & Medical Sciences (AREA)
- Bioethics (AREA)
- Health & Medical Sciences (AREA)
- Databases & Information Systems (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
Embodiments of the present disclosure provide a registration device, a retrieval device, a cloud device, corresponding methods, and a computer-readable storage medium for object tracing. The registration device may include a visual work module configured to capture an image of a first object to be traced, and a control work module configured to generate an identification code representing the first object based on the image of the first object and to send the identification code of the first object and a serial number of the first object, in a manner corresponding to each other, to the cloud device for storage by the cloud device.
Description
Technical Field
The present disclosure relates to the industrial field, and in particular to a registration device, a retrieval device, a cloud device, a corresponding method, and a computer program product for object tracing.
Background
With the rapid development of internet technology, industrial manufacturing processes have become more digital, networked, and intelligent. For example, the proposed "Industry 4.0" strategy links the virtual and physical worlds through cyber-physical systems (Cyber-Physical System, CPS). In modern industrial manufacturing, product traceability is an important aspect that provides a digital connection between the material stream and the associated information stream. Product traceability can track the condition of a product, or of its physical components, at each processing stage, which benefits quality control, processing decisions, optimization, and the like. How to optimize tracing has therefore become one of the problems to be solved.
Disclosure of Invention
In view of the need for improvements in the prior art, embodiments of the present disclosure provide a registration device, a retrieval device, a cloud device, a corresponding method, and a computer-readable storage medium for object tracing.
In one aspect, an embodiment of the disclosure provides a registration device for object tracing, comprising a visual work module configured to capture an image of a first object to be traced, and a control work module configured to generate an identification code representing the first object based on the image of the first object and to send the identification code of the first object and a serial number of the first object to a cloud device in a manner corresponding to each other, so that the cloud device can store them in correspondence.
In another aspect, an embodiment of the disclosure provides a retrieval device for object tracing, including a visual work module configured to capture an image of a second object to be traced, a control work module configured to generate an identification code for representing the second object based on the image of the second object, and send the identification code of the second object to a cloud device so that the cloud device outputs a serial number corresponding to the identification code of the second object as a serial number of the second object.
In another aspect, embodiments of the present disclosure provide a cloud device for object tracing, including a cloud database configured to store a serial number and an identification code corresponding to each other, a storage module configured to receive the identification code and the serial number of a first object from a registration device, store the identification code and the serial number of the first object to the cloud database in a manner corresponding to each other, and a retrieval module configured to receive an identification code of a second object from a retrieval device, retrieve the serial number corresponding to the identification code of the second object from the cloud database as the serial number of the second object based on the identification code of the second object, and output the serial number of the second object.
On the other hand, the embodiment of the disclosure provides a system for object tracing, which comprises the registration device, the retrieval device and the cloud device.
In another aspect, an embodiment of the disclosure provides a method for object tracing, performed by a registration device, including capturing an image of a first object to be traced, generating an identification code representing the first object based on the image of the first object, and transmitting the identification code of the first object and a serial number of the first object to a cloud device in a manner corresponding to each other, so that the cloud device stores the identification code and the serial number in correspondence.
In another aspect, an embodiment of the disclosure provides a method for object tracing, performed by a retrieval device, including capturing an image of a second object to be traced, generating an identification code representing the second object based on the image of the second object, and transmitting the identification code of the second object to a cloud device so that the cloud device outputs the serial number corresponding to the identification code of the second object as the serial number of the second object.
In another aspect, embodiments of the present disclosure provide a method for object tracing, which is performed by a cloud device, and includes receiving an identification code and a serial number of a first object from a registration device and storing the identification code and the serial number of the first object in a cloud database in a manner corresponding to each other, and/or receiving an identification code of a second object from a retrieval device, retrieving, based on the identification code of the second object, the serial number corresponding to the identification code of the second object from the cloud database as the serial number of the second object, and outputting the serial number of the second object, wherein the cloud database is used for storing the serial number and the identification code corresponding to each other.
In another aspect, embodiments of the present disclosure provide a registration device for object tracing, comprising a visual work module and a control work module, the control work module comprising at least one processor and a memory in communication with the at least one processor, the memory having stored thereon executable code that, when executed by the at least one processor, causes the at least one processor to control the visual work module and the control work module to perform the method described above with respect to the registration device.
In another aspect, embodiments of the present disclosure provide a retrieval device for object tracing, comprising a visual work module and a control work module, the control work module comprising at least one processor and a memory in communication with the at least one processor, the memory having stored thereon executable code that, when executed by the at least one processor, causes the at least one processor to control the visual work module and the control work module to perform the method described above with respect to the retrieval device.
In another aspect, embodiments of the present disclosure provide a cloud device for object tracing comprising at least one processor, a memory in communication with the at least one processor, having stored thereon executable code that, when executed by the at least one processor, causes the at least one processor to perform the method described above with respect to the cloud device.
In another aspect, embodiments of the present disclosure provide a computer-readable storage medium storing executable code that, when executed, implements the method described above with respect to a registration device, a retrieval device, or a cloud device.
In another aspect, embodiments of the present disclosure provide a computer program product comprising a computer program that, when executed, implements the method described above with respect to a registration device, a retrieval device, or a cloud device.
Drawings
The foregoing and other objects, features and advantages of embodiments of the disclosure will be apparent from the following more particular descriptions of embodiments of the disclosure as illustrated in the accompanying drawings wherein like reference numbers generally represent like elements throughout the various drawings.
Fig. 1 is a schematic block diagram of a system for object tracing in accordance with some embodiments.
Fig. 2A is a schematic block diagram of a registration device according to some embodiments.
Fig. 2B is a schematic block diagram of one implementation of a visual work module.
Fig. 3A is a schematic block diagram of a retrieval device according to some embodiments.
Fig. 3B is a schematic block diagram of one implementation of a visual work module.
Fig. 4 is a schematic block diagram of a cloud device according to some embodiments.
Fig. 5A shows a schematic diagram of one exemplary implementation of a visual work module in a registration device or a retrieval device.
Fig. 5B shows an exemplary processing procedure for an image of a magnet.
Fig. 6 is a schematic diagram of one example of an application scenario of an embodiment of the present disclosure.
Fig. 7 is a schematic flow chart diagram of a method for object tracing in accordance with some embodiments.
Fig. 8 is a schematic flow chart diagram of a method for object tracing in accordance with some embodiments.
Fig. 9 is a schematic flow chart diagram of a method for object tracing in accordance with some embodiments.
Fig. 10 is a schematic block diagram of a control work module in a registration device or a control work module in a retrieval device, according to some embodiments.
Fig. 11 is a schematic block diagram of a cloud device according to some embodiments. For example, cloud device 1000 may be one exemplary implementation of cloud device 400.
Detailed Description
The subject matter described herein will now be discussed with reference to various embodiments. It should be appreciated that these embodiments are discussed only to enable those skilled in the art to better understand and practice the subject matter described herein and are not limiting on the scope, applicability, or examples set forth in the claims.
In the industrial field, traceability of physical components is very important for quality management of the components and of the products into which they are assembled. For ease of description, the physical components that need to be traced are collectively referred to herein as physical objects or objects. Tracing may generally be batch tracing, single-product tracing, or a mixture of the two. Batch tracing typically traces a batch of physical objects, for example by adding the same batch number to each object in the batch. Batch tracing is relatively low cost, but may not be well suited to single-piece-flow production processes. Furthermore, since batch tracing targets a whole batch of physical objects, which may have come from different batches in earlier processing stages, it may be difficult to find the root cause of a later quality issue through batch tracing alone. In contrast, single-product tracing operates at a finer granularity: it typically traces a single physical object and can therefore provide more accurate tracing information.
Currently, for single-product tracing, an identifier unique to a physical object is generally added by indirect marking or direct marking. In indirect marking, a carrier bearing the unique identifier (e.g., plain text, a one-dimensional code, a two-dimensional code, or a radio-frequency identification tag), such as a printed label, may be attached or affixed to the physical object. In direct marking, the unique identifier may be added directly on the physical object itself, for example by mechanical engraving, laser marking, inkjet printing, or electrochemical marking. Both approaches impose certain requirements on the size or shape of the physical object, and both are relatively costly. Furthermore, with indirect marking, the carrier bearing the unique identifier is easily lost during subsequent transportation, which may make the physical object untraceable. Direct marking, for its part, may damage the surface of the physical object and thereby affect its performance.
In view of this, embodiments of the present disclosure provide an image-based solution for object tracing, enabling non-invasive, contact-free object tracing at relatively low cost. The following description will be made with reference to specific embodiments.
Fig. 1 is a schematic block diagram of a system for object tracing in accordance with some embodiments.
As shown in fig. 1, the system 100 may include a registration device 200, a retrieval device 300, and a cloud device 400.
The registration device 200 may connect to the cloud device 400 to communicate with the cloud device 400. Communication between the registration device 200 and the cloud device 400 may be achieved in a variety of suitable ways, such as based on wired or wireless communication standards.
The recall device 300 may also be connected to the cloud device 400 for communication with the cloud device 400. Communication between the recall device 300 and the cloud device 400 may also be achieved in a variety of suitable ways, such as based on wired or wireless communication standards.
In general, the registration device 200 may perform image capturing on an object to be traced, generate an identification code for representing the object based on the captured image, and then transmit the identification code and serial number of the object to the cloud device 400. The identification code of the object corresponds to the serial number of the object. The cloud device 400 may store the received identification code and serial number in a manner corresponding to each other. This may also be understood as registering an object that needs to be traced to the cloud device 400. In this way, the cloud device 400 can store the identification code and the serial number corresponding to each other.
The retrieval device 300 may perform image capturing on an object to be traced, generate an identification code representing the object based on the captured image, and then transmit the identification code of the object to the cloud device 400. The cloud device 400 may retrieve the corresponding serial number based on the identification code received from the retrieval device 300 and output the retrieved serial number; for example, the cloud device 400 may send the retrieved serial number to a manufacturing execution system (Manufacturing Execution System, MES). After the MES obtains the serial number, it can obtain more information about the object based on the serial number, such as various processing information of the object. In addition, the cloud device 400 may send the retrieved serial number to the retrieval device 300 for recording. Of course, the cloud device 400 may also send the retrieved serial number to other production systems associated with the object to be traced for subsequent use.
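The registration and retrieval flows described above can be sketched as a simple store/look-up pair. This is a minimal illustrative sketch, not from the patent: the `CloudDatabase` class and its method names are assumptions, and a real cloud device would likely add persistence and similarity-based matching of identification codes rather than the exact-match dictionary shown here.

```python
# Hedged sketch of the cloud device's register/retrieve behaviour.
# CloudDatabase is an illustrative stand-in, not a name from the patent.

class CloudDatabase:
    """In-memory stand-in for the cloud database of serial-number /
    identification-code pairs stored 'in a manner corresponding to each other'."""

    def __init__(self):
        # Indexed by identification code so retrieval-by-code is direct.
        self._code_to_serial = {}

    def register(self, serial_number, identification_code):
        # Registration path: store the pair sent by the registration device.
        self._code_to_serial[identification_code] = serial_number

    def retrieve(self, identification_code):
        # Retrieval path: return the serial number for a code sent by the
        # retrieval device, or None if the object was never registered.
        return self._code_to_serial.get(identification_code)


db = CloudDatabase()
db.register("SN-0001", "a3f9c2")   # registration device 200 path
serial = db.retrieve("a3f9c2")     # retrieval device 300 path
```

In this sketch, `serial` comes back as `"SN-0001"`; an unregistered code returns `None`, which a production system could treat as a tracing failure.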
For example, for the same target object, the registration device 200 may capture an image of the target object after the target object is processed on the first production line, generate an identification code of the target object, and send the identification code of the target object and a serial number of the target object to the cloud device 400 in a manner corresponding to each other. The cloud device 400 may store the identification code of the target object and the serial number of the target object in a manner corresponding to each other.
The retrieval device 300 may capture an image of the target object after the target object is transported to the second production line and before the target object is processed on the second production line, generate an identification code of the target object, and transmit the identification code of the target object to the cloud device 400.
The cloud device 400 may retrieve a serial number of the target object based on the identification code of the target object received from the retrieval device 300 and transmit the serial number of the target object to a production system associated with the target object so that the production system performs an operation on the target object based on the serial number of the target object. In embodiments of the present disclosure, the production system may be a system associated with the first and second production lines (e.g., database system, control system, etc.) that may be used to record process information associated with the first and second production lines, control the first and second production lines, etc. The specific form of the production system can be determined according to the actual application scenario and service requirements, and is not limited herein.
It can be seen that with such a system, objects can be traced efficiently and reliably without special requirements on the shape or size of the objects and without damaging their surfaces, while greatly reducing cost.
In addition, because the technical solution provided by the embodiments of the present disclosure performs tracing based on images, it can also be understood as tracing an object by its natural fingerprint (Natural Fingerprint).
Embodiments of the present disclosure may be applied in various scenarios. For example, they may be applied to the tracing of physical components within a production plant, particularly where physical components must be processed on different production lines that cannot directly share processing information with one another, e.g., where a physical component must be transported to production line B for processing after processing on production line A is complete. In this case, by means of the embodiments of the present disclosure, the relevant information of the physical component can still be traced efficiently and reliably at production line B.
The registration device 200, the cloud device 400, and the retrieval device 300 of fig. 1 will be described below in connection with specific embodiments, respectively.
Fig. 2A is a schematic block diagram of a registration device according to some embodiments.
As shown in fig. 2A, the registration device 200 may include a visual work module 210 and a control work module 220.
The visual work module 210 may capture an image of the first object that needs to be traced. The control work module 220 may generate an identification code representing the first object based on the image of the first object. The control work module 220 may send the identification code of the first object and the serial number of the first object to the cloud device in a manner corresponding to each other, so that the cloud device stores them.
In such an embodiment, the identification code of the object is obtained based on the image, and the identification code and the serial number of the object are sent to the cloud device in a manner corresponding to each other for storage, so that the information of the object can be efficiently registered to the cloud device for subsequent tracing. In the process, no special requirement is made on the shape or size of the object, and the problems of damaging the surface of the object, losing the object identification and the like can be avoided. In addition, this approach also greatly reduces costs.
In some embodiments, the registration device 200 may be installed at a location after the last station of the first production line, such that the visual work module 210 captures an image of the first object after the first object passes through the last station.
For example, in some embodiments, the registration device 200 may be installed at a location where the first object is finished, such that the visual work module 210 captures an image of the first object after the first object is finished. For example, the registration device 200 may be installed at a quality check station of the first object.
In some embodiments, the visual work module 210 may include various suitable components or units for enabling image capture. For example, FIG. 2B is a schematic block diagram of one implementation of the visual work module 210. As shown in fig. 2B, the visual work module 210 may include a light source 211, a lens 212, and an imaging unit 213.
The light source 211 may provide illumination for the first object. For example, the light source 211 may illuminate the first object, which helps improve the imaging effect. The light source 211 may be implemented using any suitable light source, such as a high-frequency fluorescent lamp, a fiber-optic halogen lamp, or a light-emitting diode (LED) lamp.
The lens 212 and the imaging unit 213 may work together to capture an image of the first object under illumination provided by the light source 211. The lens 212 may be implemented using any suitable lens. The imaging unit 213 may comprise any suitable imaging device, such as a charge coupled device (Charge Coupled Device, CCD) or a complementary metal oxide semiconductor (Complementary Metal Oxide Semiconductor, CMOS) based imaging device.
The specific implementation of the light source 211, the lens 212 and the imaging unit 213 depends on the actual application scenario, service requirements, etc., which are not limited herein.
In some embodiments, as shown in fig. 2B, the vision working module 210 may further include a positioning unit 214. The positioning unit 214 may fix the first object to a preset position. The preset position may be a position advantageous for capturing an image of the first object. The light source 211 may provide illumination to the first object after the positioning unit 214 fixes the first object to a preset position. Accordingly, in this case, the lens 212 and the imaging unit 213 may be triggered to work together to capture an image of the first object under illumination provided by the light source 211.
In some implementations, the visual work module 210 may operate under the control of the control work module 220. For example, the control work module 220 may control the positioning unit 214 of the vision work module 210 to fix the first object to a preset position. Thereafter, the control work module 220 may control the light source 211 to provide illumination for the first object. Then, the control operation module 220 may control the lens 212 and the imaging unit 213 to operate together to capture an image of the first object under illumination provided by the light source 211.
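The position-illuminate-capture sequence that the control work module drives can be sketched as follows. This is a hedged illustration only: the `Positioner`, `LightSource`, and `Camera` classes are hypothetical stand-ins for the positioning unit 214, light source 211, and lens/imaging unit 212/213, and the patent does not prescribe any particular software interface.

```python
# Illustrative sketch of the capture sequence described above.
# All class and method names are assumptions, not from the patent.

class Positioner:
    def fix_to_preset(self):
        # Stand-in for positioning unit 214 fixing the object at the
        # preset position favourable for imaging.
        self.positioned = True

class LightSource:
    def on(self):
        # Stand-in for light source 211 illuminating the object.
        self.lit = True

class Camera:
    def capture(self):
        # Stand-in for lens 212 + imaging unit 213 capturing a frame.
        return [[0, 1], [1, 0]]  # dummy image data

def capture_image(positioner, light, camera):
    positioner.fix_to_preset()  # 1. fix the object at the preset position
    light.on()                  # 2. provide illumination
    return camera.capture()     # 3. capture the image under illumination

image = capture_image(Positioner(), LightSource(), Camera())
```

The ordering matters: the image is only triggered after the object is fixed and illuminated, matching the triggering described for the positioning unit and light source above.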
Only some embodiments of the visual work module 210 are described above, however, it should be understood that in different implementations, the visual work module 210 may also include other units or devices, which are not limited in this regard.
The control work module 220 may process the image of the first object to obtain an identification code representing the first object.
For example, in some embodiments, the control work module 220 may extract a key region image from the image of the first object. The key region image is an image of a key region of the first object, the key region containing natural textures of the first object. The control work module 220 may then process the key region image using a fine-grained-texture-based feature extraction algorithm to obtain the identification code.
The extraction of the key region image may be implemented in any suitable manner. For example, the control work module 220 may crop the image of the first object using a predefined region of interest (i.e., the key region) on the first object, thereby obtaining the key region image. Unlike the indirect and direct marking methods, which require adding an identifier on the object itself or on a carrier attached to the object, the key region in embodiments of the present disclosure is a selected area on the first object that contains the natural texture (natural fingerprint) of the first object in that area, i.e., the texture characteristics of the first object itself. Non-invasive, contact-free object tracing can thus be achieved without relying on any identifier added to the first object. In some embodiments, the key region image may be a fine-grained texture region that facilitates unique identification of the first object. The key region image may also be understood as, or referred to as, the fingerprint area of the first object.
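Cropping a predefined region of interest can be sketched with simple array slicing. This is an assumption-laden illustration, not the patent's implementation: the ROI format `(top, left, height, width)` and the function name are hypothetical, and a real system might first align the image to the object before cropping.

```python
import numpy as np

# Hedged sketch: extract the key-region (fingerprint-area) image by cropping
# a predefined region of interest from the captured image.
# The (top, left, height, width) ROI convention is an assumption.

def extract_key_region(image, roi):
    """Crop the key region from the full image.

    image: 2-D array of pixel values.
    roi:   (top, left, height, width), predefined for this object type.
    """
    top, left, h, w = roi
    return image[top:top + h, left:left + w]

# Dummy 10x10 "image" with distinct pixel values.
image = np.arange(100).reshape(10, 10)
key_region = extract_key_region(image, (2, 3, 4, 4))
```

In practice the ROI would be chosen offline as an area rich in fine-grained natural texture, then applied identically by both the registration and retrieval devices so that their identification codes are comparable.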
The identification code may be generated from the key region image in any suitable manner. For example, the control work module 220 may process the key region image using a convolutional neural network (Convolutional Neural Network, CNN), obtain the output results of at least two layers in the CNN, and combine these output results to generate the identification code.
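The "combine outputs of at least two layers" idea can be illustrated with a toy stand-in. To be clear about assumptions: the patent does not specify the CNN architecture, so the two fixed averaging "layers" below merely mimic intermediate CNN feature maps, and hashing the concatenated features into a short hex string is one hypothetical way to turn features into a compact identification code.

```python
import hashlib
import numpy as np

# Illustrative sketch only; not the patent's CNN. Two fixed 3x3 convolution
# "layers" stand in for intermediate CNN layers; their pooled outputs are
# concatenated (the "combine" step) and hashed into a hex identification code.

def conv2d(img, kernel):
    """Naive valid-mode 2-D convolution."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def identification_code(key_region):
    kernel = np.ones((3, 3)) / 9.0
    layer1 = np.maximum(conv2d(key_region, kernel), 0)  # "layer 1" + ReLU
    layer2 = np.maximum(conv2d(layer1, kernel), 0)      # "layer 2" + ReLU
    # Combine outputs of the two layers into one feature vector.
    features = np.concatenate([layer1.mean(axis=0), layer2.mean(axis=0)])
    return hashlib.sha256(features.round(6).tobytes()).hexdigest()[:16]

code = identification_code(np.arange(64, dtype=float).reshape(8, 8))
```

The important property this sketch preserves is determinism: the same key-region image always yields the same code, which is what lets the registration and retrieval devices produce matching codes for the same object.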
In some embodiments, the surface of the first object may be functional, for example surface-treated with a special process to provide corrosion resistance, high-pressure resistance, abrasion resistance, and the like. As mentioned above, it is conventionally possible to mark directly on the surface of a physical object. However, for a physical object with a functional surface, direct marking may affect the function and quality of the object; for example, laser coding or electrochemical marking on such surfaces may impair their functional properties. In these embodiments of the present disclosure, by contrast, damage to the surface of the object can be effectively avoided, because the identification code that enables tracing is generated from an image containing the natural texture of the object. The surface of the first object may be a metallic material or a magnetic material.
The control work module 220 may send the identification code and the serial number of the first object to the cloud device in a manner corresponding to each other. For example, the control work module 220 may combine the serial number of the first object with the identification code of the first object to form a key-value pair for the first object, where the key represents the serial number and the value represents the identification code. The control work module 220 may then send this key-value pair to the cloud device for storage.
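Forming and serialising such a key-value pair might look like the following. The JSON wire format and the function name are assumptions for illustration; the patent only specifies that the serial number and identification code are sent "in a manner corresponding to each other".

```python
import json

# Hedged sketch: build the key-value pair described above (serial number as
# key, identification code as value) and serialise it for transmission to
# the cloud device. JSON is an assumed wire format, not stated in the patent.

def make_registration_payload(serial_number, identification_code):
    return json.dumps({serial_number: identification_code})

payload = make_registration_payload("SN-0001", "a3f9c2")
```

On the cloud side, deserialising the payload recovers the pair, which the storage module can then write to the cloud database in correspondence.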
In some implementations, the control work module 220 may be implemented by an industrial control computer or the like.
Fig. 3A is a schematic block diagram of a retrieval device according to some embodiments.
As shown in fig. 3A, the recall device 300 may include a visual work module 310 and a control work module 320.
The visual work module 310 may capture an image of the second object that needs to be traced.
The control work module 320 may generate an identification code representing the second object based on the image of the second object.
The control work module 320 may send the identification code of the second object to the cloud device, so that the cloud device outputs the serial number corresponding to the identification code of the second object as the serial number of the second object.
In such an embodiment, when a specific object needs to be traced, an image of the object is captured, then an identification code is generated based on the image of the object, and the identification code is further sent to the cloud device so that the cloud device outputs a corresponding serial number, thereby being capable of efficiently and reliably realizing object tracing.
In some embodiments, the retrieval device 300 may be mounted at a location on the second production line before the first station, such that the visual work module 310 can capture an image of the second object before the second object enters the first station.
For example, in some embodiments, the retrieval device 300 may be installed at a location prior to the second object being assembled into the final product. In this way, the visual work module 310 may capture an image of the second object before the second object is assembled into the final product.
The visual work module 310 in the retrieval device 300 is similar to the visual work module 210 in the registration device 200. For example, FIG. 3B is a schematic block diagram of one implementation of the visual work module 310. As shown in fig. 3B, the visual work module 310 may include a light source 311, a lens 312, and an imaging unit 313.
The light source 311 may provide illumination for the second object, which may be advantageous for improving the imaging effect. The light source 311 may be implemented using any suitable light source, such as a high-frequency fluorescent lamp, a fiber-optic halogen lamp, an LED lamp, or the like.
The lens 312 and the imaging unit 313 may work together to capture an image of the second object under illumination provided by the light source 311. The lens 312 may be implemented using any suitable lens. The imaging unit 313 may comprise any suitable imaging device, such as a CCD- or CMOS-based imaging device.
The specific implementation of the light source 311, the lens 312, and the imaging unit 313 depends on the actual application scenario, service requirements, etc., which are not limited herein.
In some embodiments, as shown in fig. 3B, the visual work module 310 may further include a positioning unit 314. The positioning unit 314 may fix the second object to a preset position. The preset position may be a position advantageous for capturing an image of the second object. The light source 311 may provide illumination to the second object after the positioning unit 314 fixes the second object to a preset position. Accordingly, in this case, the lens 312 and the imaging unit 313 may be triggered to work together to capture an image of the second object under the illumination provided by the light source 311.
In some implementations, the visual work module 310 may operate under the control of the control work module 320. For example, the control work module 320 may control the positioning unit 314 of the vision work module 310 to fix the second object to a preset position. Thereafter, the control work module 320 may control the light source 311 to provide illumination for the second object. The control work module 320 may then control the lens 312 and the imaging unit 313 to work together to capture an image of the second object under the illumination provided by the light source 311.
Only some embodiments of the visual work module 310 are described above, however, it should be understood that in different implementations, the visual work module 310 may also include other units or devices, which are not limited in this regard.
The control work module 320 may process the image of the second object to obtain the identification code of the second object.
For example, the control work module 320 may extract a key region image from the image of the second object and process the key region image using a fine-grained-texture-based feature extraction algorithm to obtain the identification code of the second object.
The extraction of the key region image may be implemented in any suitable manner. For example, the control work module 320 may crop the image of the second object using a predefined region of interest (i.e., the key region) to obtain the key region image. The key region may be a selected region on the second object that contains the natural texture of the second object itself, that is, texture features inherent to the second object, without relying on any identification added to the second object. For example, the key region image may be a fine-grained texture region that facilitates unique identification of the second object. The key region image may also be understood as, or referred to as, a fingerprint region.
The identification code may be generated from the key region image in any suitable manner. For example, the control work module 320 may process the key region image with a convolutional neural network (CNN), obtain the output results of at least two layers in the CNN, and combine the output results to generate the identification code.
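The idea of combining the outputs of at least two layers can be sketched as follows. This is a hedged illustration only: the kernel sizes, the region-of-interest coordinates, and the use of fixed random weights are all assumptions, and a real system would use a trained CNN with shared weights between the registration and retrieval devices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random kernels stand in for trained CNN weights. Registration and
# retrieval must use the SAME weights so codes for the same object agree.
K1 = rng.standard_normal((4, 3, 3))  # hypothetical layer-1 kernels
K2 = rng.standard_normal((4, 3, 3))  # hypothetical layer-2 kernels

def conv2d(x, kernels):
    """Valid 2-D convolution of a single-channel image, followed by ReLU."""
    kh, kw = kernels.shape[1:]
    h, w = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.empty((len(kernels), h, w))
    for k, kernel in enumerate(kernels):
        for i in range(h):
            for j in range(w):
                out[k, i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)

def identification_code(image, roi):
    """Crop the key region, run two conv layers, combine both layers' outputs."""
    top, left, size = roi
    patch = image[top:top + size, left:left + size]  # key-region (fingerprint) crop
    f1 = conv2d(patch, K1)                           # layer-1 feature maps
    f2 = conv2d(f1.mean(axis=0), K2)                 # layer-2 feature maps
    # Combine the outputs of two layers via global average pooling.
    return np.concatenate([f1.mean(axis=(1, 2)), f2.mean(axis=(1, 2))])

img = rng.random((32, 32))                           # stand-in object image
code = identification_code(img, roi=(8, 8, 16))
print(code.shape)                                    # prints (8,): an 8-dim code
```

Pooling and concatenating two layers yields a fixed-length vector regardless of the exact crop size, which is what allows the code to be stored and compared as a vector later.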
The control work module 320 may send the identification code of the second object to the cloud device, so that the cloud device outputs the serial number corresponding to the identification code of the second object as the serial number of the second object.
In some embodiments, the control work module 320 may also receive the serial number of the second object from the cloud device and store it as a record for subsequent querying or use.
In some implementations, the control work module 320 may be implemented using an industrial control computer or the like.
While some of the modules of the registration device and the retrieval device are described above, it should be understood that in actual practice, the registration device or the retrieval device may include other additional modules, such as a graphical user interface (GUI), a web service module, a logging module, a resource monitoring module, a containerized software module (such as Docker), etc., which are not limited in this regard.
Fig. 4 is a schematic block diagram of a cloud device according to some embodiments.
As shown in fig. 4, cloud device 400 may include a storage module 410, a retrieval module 420, and a cloud database 430. Both the storage module 410 and the retrieval module 420 may be connected to a cloud database 430. The cloud database 430 may store serial numbers and identification codes corresponding to each other.
The storage module 410 may be in communication with a registration device. The storage module 410 may receive the identification code and the serial number of the first object from the registration device and store them in the cloud database 430 in a manner corresponding to each other.
The retrieval module 420 may be in communication with a retrieval device. The retrieval module 420 may receive the identification code of the second object from the retrieval device, and retrieve a serial number corresponding to the identification code of the second object as the serial number of the second object from the cloud database 430 based on the identification code of the second object. The retrieval module 420 may output the sequence number of the second object.
In such an embodiment, the cloud device may store the identification code and the serial number of the object to be traced in the cloud database, and, when retrieval is required, may look up the corresponding serial number based on the identification code. Object tracing can thus be realized efficiently and reliably, and cost can be greatly reduced.
In some embodiments, cloud database 430 may store serial numbers and identification codes corresponding to each other in key-value pairs. In each key-value pair, a key represents a serial number and a value represents an identification code corresponding to the serial number.
For example, cloud database 430 may store a plurality of entries, each entry being a key-value pair. For example, a key-value pair may be expressed as {key = serial number; value = identification code}.
In some embodiments, the storage module 410 may receive a key-value pair of the first object from the registration device, wherein in the key-value pair of the first object, a key may represent a serial number of the first object and a value may represent an identification code of the first object. The storage module 410 may store the key value pairs of the first object into the cloud database 430.
In some embodiments, after receiving the identification code of the second object from the retrieval device 300, the retrieval module 420 may determine the proximity between the identification code of the second object and each identification code stored in the cloud database 430, and may use the serial number corresponding to the identification code with the highest proximity to the identification code of the second object as the serial number of the second object.
As previously described, the identification code is generated based on the image and is thus in essence a vector. Determining the proximity between the identification code of the second object and each identification code stored in the cloud database 430 may therefore be understood as determining the distance between vectors, which may be measured in a variety of suitable ways. For example, in some embodiments, the Euclidean distance (i.e., the L2 distance) between the identification code of the second object and each identification code stored in the cloud database 430 may be determined, and the serial number corresponding to the identification code with the smallest Euclidean distance to the identification code of the second object may be used as the serial number of the second object.
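A minimal sketch of this nearest-neighbor lookup might look as follows; the database contents and serial numbers below are made-up examples, not values from the disclosure:

```python
import numpy as np

# Toy stand-in for the cloud database: serial number -> identification code.
database = {
    "20211028312": np.array([0.02, 0.36, 2.56, 0.02, -3.5]),
    "20211028313": np.array([1.10, -0.40, 0.75, 2.20, 0.6]),
}

def retrieve_serial_number(query_code):
    """Return the serial number whose stored code has the smallest L2 distance."""
    distances = {sn: np.linalg.norm(code - query_code)
                 for sn, code in database.items()}
    return min(distances, key=distances.get)

# A re-captured image rarely yields an identical code, only a nearby one:
query = np.array([0.02, 0.36, 2.56, 0.02, -3.5]) + 0.01
print(retrieve_serial_number(query))  # prints 20211028312
```

The small perturbation in the query imitates the capture-to-capture variation discussed above: the code need not match exactly, it only needs to be closer to its own registered code than to any other.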
In some embodiments, the retrieval module 420 may output the serial number of the second object to any system or device that requires this information. For example, the retrieval module 420 may send the serial number of the second object to a production system associated with the second object for subsequent use, such as a manufacturing execution system (MES). In this way, the MES can perform subsequent operations on the second object based on its serial number, e.g., obtain previous processing information for the second object, or associate the serial number of the second object with the serial number of the final product into which the second object is assembled, and so forth.
For ease of understanding, embodiments of the present disclosure will be further described below in connection with specific examples. It should be understood that these examples do not set any limit to the scope of the present disclosure.
In the following example, it is assumed that the object to be traced is a magnet. The magnet may be an important component of a larger part, and its processing parameters and/or conditions generally have an important effect on the quality of that part, so traceability of the magnet can effectively improve the quality of the part. However, magnets are typically small. In this case, it is difficult to mark the magnet efficiently with the aforementioned indirect or direct marking methods: attaching an additional label may be inconvenient due to the magnet's small physical dimensions, and directly marking the magnet may easily damage its surface and affect its performance. In view of this, traceability of the magnet may be achieved with the technical solutions provided by the embodiments of the present disclosure.
Fig. 5A shows a schematic diagram of one exemplary implementation of a visual work module in a registration device or a retrieval device. The visual work module 500 in fig. 5A may be provided in a registration device or in a retrieval device to capture images of the magnet.
In the example of fig. 5A, the vision working module 500 may include a light source 502, a lens 504, an imaging unit 506, and a positioning unit 508.
The positioning unit 508 may include a metal bracket and a clip. The clips may secure the magnets to the top of the metal bracket in a fixed orientation. After the positioning unit 508 secures the magnet, the light source 502 may be turned on to provide illumination to the magnet. The lens 504 and imaging unit 506 may capture an image of the magnet.
Only some of the main units or devices of the vision working module are described above. It should be appreciated that the vision working module may also include some other units or devices, as also shown in fig. 5A, which is not limited herein.
Fig. 5B shows an exemplary processing procedure for an image of a magnet.
In the example of fig. 5B, stages 540 through 542 are performed at the registration device 200 to register the serial number and identification code of the magnet with the cloud device 400. Stages 550 and 551 are performed at the retrieval device 300 to retrieve the serial number of the magnet from the cloud device 400.
In stage 540, the registration device 200 may extract a key region image, i.e., a fingerprint region 521, from the image 520 of the magnet. For example, the image 520 may be captured using the vision work module 500 shown in fig. 5A.
In stage 541, the registration device 200 may process the fingerprint region 521 using the CNN to obtain the identification code 522 of the magnet. In the example of fig. 5B, the identification code 522 is assumed to be [0.02, 0.36, 2.56, ..., 0.02, -3.5].
At stage 542, the registration device 200 may combine the identification code 522 of the magnet with the serial number of the magnet to form a key-value pair 523 of the magnet, where the key may represent the serial number and the value may represent the identification code. In the example of fig. 5B, the key-value pair 523 is assumed to be {key = 20211028312; value = [0.02, 0.36, 2.56, ..., 0.02, -3.5]}. That is, the serial number of the magnet may be 20211028312.
The registration device 200 may then send the key value pair 523 of the magnet to the cloud device 400 for storage.
As shown in fig. 5B, the cloud device 400 may have a cloud database, where the cloud database stores serial numbers and identification codes corresponding to each other in the form of key value pairs.
Thereafter, when the magnet needs to be retrieved at some later time, the retrieval device 300 may capture an image 530 of the magnet using the visual work module 500 shown in fig. 5A. In stage 550, the retrieval device 300 may extract the fingerprint region 531 from the image 530 of the magnet. In stage 551, the retrieval device 300 may process the fingerprint region 531 with the CNN to obtain an identification code 532. For example, assume that the identification code 532 is [0.02, 0.36, 2.56, ..., 0.02, -3.5].
Thereafter, the retrieval device 300 may send the identification code 532 to the cloud device 400.
The cloud device 400 may determine the Euclidean distance between the identification code 532 received from the retrieval device 300 and each identification code stored in the cloud database, take the serial number corresponding to the identification code with the smallest Euclidean distance as the serial number of the magnet, and output that serial number. As shown in fig. 5B, the retrieved serial number is 20211028312. The cloud device 400 can then send the serial number to the MES.
For ease of understanding, in the example of fig. 5B, the identification code obtained by the retrieval device and the identification code obtained by the registration device are represented as the same vector. In practice, however, the two codes obtained for the same object may not be exactly identical, because the image capture and/or the selection of the key region image (fingerprint region) may differ slightly between the two devices. In general, though, for the same object the Euclidean distance between the identification code obtained by the retrieval device and the identification code obtained by the registration device will be the smallest.
Fig. 6 is a schematic diagram of one example of an application scenario of an embodiment of the present disclosure.
In the example of fig. 6, at stage 610, the object is processed on a production line and enters a testing stage. At this time, the registration device 200 may generate an identification code for the object and register the serial number and identification code of the object with the cloud device 400. Thereafter, at stage 620, the object may be transported via in-plant logistics to another production line for processing (e.g., for product assembly). At this point (e.g., before the object is assembled on the other production line), the retrieval device 300 may generate the identification code of the object and send it to the cloud device 400. The cloud device 400 can retrieve the corresponding serial number and send it to the MES 650. The MES 650 can then obtain prior processing information about the object based on the serial number, associate the serial number with the serial number of the product into which the object is to be assembled, and so forth.
Fig. 7 is a schematic flow chart diagram of a method for object tracing in accordance with some embodiments. The method of fig. 7 may be performed by the registration device 200.
At step 702, an image of a first object that needs to be traced may be captured.
At step 704, an identification code representing the first object may be generated based on the image of the first object.
At step 706, the identification code of the first object and the serial number of the first object may be sent to the cloud device in a manner corresponding to each other, so that the cloud device stores the identification code and the serial number of the first object.
The specific procedures of the method of fig. 7 may correspond to the functions described above with respect to the registration apparatus 200, and thus, for brevity of description, the specific procedures of the method of fig. 7 are not repeated herein.
Fig. 8 is a schematic flow chart diagram of a method for object tracing in accordance with some embodiments. The method of fig. 8 may be performed by the retrieval device 300.
At step 802, an image of a second object that needs to be traced may be captured.
At step 804, an identification code representing the second object may be generated based on the image of the second object.
At step 806, an identification code of the second object may be sent to the cloud device, so that the cloud device outputs a serial number corresponding to the identification code of the second object as the serial number of the second object.
The specific process of the method of fig. 8 may correspond to the specific function described above with respect to the retrieval device 300, and thus, for brevity of description, the specific process of the method of fig. 8 will not be described again here.
Fig. 9 is a schematic flow chart diagram of a method for object tracing in accordance with some embodiments. The method of fig. 9 may be performed by cloud device 400.
At step 902, an identification code and a serial number of a first object are received from a registration device and stored in a cloud database in correspondence with each other.
At step 904, an identification code of the second object is received from the retrieval device, a serial number corresponding to the identification code of the second object is retrieved from the cloud database as a serial number of the second object based on the identification code of the second object, and the serial number of the second object is output.
Although steps 902 and 904 are shown as being performed sequentially in fig. 9, in practice steps 902 and 904 may be performed in parallel or step 904 may be performed prior to step 902, which is not limited herein.
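The storage step (902) and the retrieval step (904) can be sketched together as a small in-memory stand-in for the cloud device. The class and method names below are illustrative assumptions, and a plain dictionary stands in for the cloud database of key-value pairs:

```python
import math

class CloudDevice:
    """Hypothetical sketch of the cloud-device behavior in Fig. 9."""

    def __init__(self):
        self.db = {}  # serial number -> identification code (key-value pairs)

    def store(self, serial_number, identification_code):
        """Step 902: store the pair received from the registration device."""
        self.db[serial_number] = identification_code

    def retrieve(self, query_code):
        """Step 904: return the serial number with the closest stored code (L2)."""
        def l2(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(self.db, key=lambda sn: l2(self.db[sn], query_code))

cloud = CloudDevice()
cloud.store("20211028312", [0.02, 0.36, 2.56])
cloud.store("20211028313", [1.50, -0.20, 0.10])
print(cloud.retrieve([0.03, 0.35, 2.60]))  # prints 20211028312
```

Because `store` and `retrieve` operate on independent entries, the order independence noted above follows naturally: a retrieval for one object can run before, after, or concurrently with the registration of another.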
The specific process of the method of fig. 9 may correspond to the specific function described above with respect to the cloud device 400, and thus, for brevity of description, the specific process of the method of fig. 9 is not described herein in detail.
Fig. 10 is a schematic block diagram of a control work module in a registration device or a retrieval device, according to some embodiments. For example, the control work module 1000 shown in fig. 10 may be one exemplary implementation of the control work module 220 in the registration device 200 or the control work module 320 in the retrieval device 300.
As shown in fig. 10, the control work module 1000 may include a processor 1002, a memory 1004, an input interface 1006, and an output interface 1008, which may be coupled together by a bus 1010. However, it should be understood that FIG. 10 is merely illustrative and is not intended to limit the scope of the present disclosure. For example, in different application scenarios, the control work module 1000 may include more or fewer components, which are not limited herein.
The memory 1004 may be used to store various information, executable instructions, code, etc. related to the functions or operations of the control work module 1000. For example, the memory 1004 may include, but is not limited to, random access memory (RAM), read-only memory (ROM), flash memory, programmable ROM (PROM), erasable programmable ROM (EPROM), registers, a hard disk, and so forth.
The processor 1002 may be configured to perform or implement various functions or operations of the control work module 1000. For example, in the case where the control work module 1000 is provided in the registration device 200, the processor 1002 may execute executable instructions or code stored in the memory 1004, thereby implementing the various functions described above with respect to the registration device 200. In the case where the control work module 1000 is provided in the retrieval device 300, the processor 1002 may execute executable instructions or code stored in the memory 1004, thereby implementing the various functions described above with respect to the retrieval device 300. The processor 1002 may include a variety of suitable processors, for example, a general-purpose processor (such as a central processing unit (CPU)) or a special-purpose processor (such as a digital signal processor, an application-specific integrated circuit, etc.).
The input interface 1006 may receive various forms of data. For example, in the case where the control work module 1000 is provided in the registration device 200, the input interface 1006 may receive an image of a first object that needs to be traced. In the case where the control work module 1000 is provided in the retrieval device 300, the input interface 1006 may receive an image of a second object that needs to be traced.
The output interface 1008 may output various forms of data. For example, in the case where the control work module 1000 is provided in the registration device 200, the output interface 1008 may output an identification code and a serial number corresponding to each other. In the case where the control work module 1000 is provided in the retrieval device 300, the output interface 1008 may output an identification code. The output interface 1008 may communicate with the cloud device based on various applicable communication standards.
Fig. 11 is a schematic block diagram of a cloud device according to some embodiments. For example, the cloud device 1100 may be one exemplary implementation of the cloud device 400.
As shown in fig. 11, cloud device 1100 may include a processor 1102, a memory 1104, an input interface 1106, and an output interface 1108, which may be coupled together by a bus 1110. However, it should be understood that FIG. 11 is merely illustrative and is not intended to limit the scope of the present disclosure. For example, cloud device 1100 may include more or fewer components in different application scenarios, which are not limited herein.
The memory 1104 may be used to store various information, executable instructions or code, etc. related to the functions or operations of the cloud device 1100. For example, the cloud database described above may be provided in the memory 1104. For example, memory 1104 may include, but is not limited to, RAM, ROM, flash memory, PROM, EPROM, registers, hard disk, and the like.
The processor 1102 may be used to perform or implement various functions or operations of the cloud device 1100. The processor 1102 may execute executable instructions or code stored in the memory 1104 to implement the various functions described above with respect to the cloud device 400. The processor 1102 may include various suitable processors, e.g., general purpose processors (such as a CPU), special purpose processors (such as a digital signal processor, application specific integrated circuit, etc.).
Input interface 1106 can receive various forms of data. For example, the input interface 1106 may receive a serial number and an identification code of the first object from the registration device 200. The input interface 1106 may receive an identification code of the second object from the recall device 300.
The output interface 1108 may output various forms of data. For example, the output interface 1108 may output the serial number of the second object.
In some implementations, the cloud device 1100 may be implemented based on the open-source vector database system Milvus. Of course, the cloud device 1100 may also be implemented in other ways, which are not limited by this disclosure.
Embodiments of the present disclosure also provide a computer-readable storage medium. The computer-readable storage medium may store executable code that, when executed, implements the specific processes described above with respect to the registration device, the retrieval device, or the cloud device.
For example, the computer-readable storage medium may include, but is not limited to, RAM, ROM, electrically erasable programmable read-only memory (EEPROM), static random access memory (SRAM), a hard disk, flash memory, and the like.
Embodiments of the present disclosure also provide a computer program product. The computer program product comprises a computer program. The computer program, when executed, implements the specific processes described above with respect to the registration device, the retrieval device, or the cloud device.
The foregoing has described specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
Not all the steps and units in the above-mentioned flowcharts and system configuration diagrams are necessary, and some steps or units may be omitted according to actual requirements. The apparatus structures described in the above embodiments may be physical structures or logical structures, that is, some units may be implemented by the same physical entity, some units may be implemented by multiple physical entities respectively, or may be implemented jointly by some components in multiple independent devices.
The alternative implementation of the embodiment of the present disclosure has been described in detail above with reference to the accompanying drawings, but the embodiment of the present disclosure is not limited to the specific details of the foregoing implementation, and various modifications may be made to the technical solutions of the embodiment of the present disclosure within the scope of the technical concept of the embodiment of the present disclosure, which all fall within the protection scope of the embodiment of the present disclosure.
Claims (30)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410124157.9A CN120416321A (en) | 2024-01-30 | 2024-01-30 | Device, method and computer program product for object tracing |
| PCT/EP2025/051799 WO2025162834A1 (en) | 2024-01-30 | 2025-01-24 | Devices, methods, and computer program product for object tracing |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410124157.9A CN120416321A (en) | 2024-01-30 | 2024-01-30 | Device, method and computer program product for object tracing |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN120416321A true CN120416321A (en) | 2025-08-01 |
Family
ID=94386153
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202410124157.9A Pending CN120416321A (en) | 2024-01-30 | 2024-01-30 | Device, method and computer program product for object tracing |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN120416321A (en) |
| WO (1) | WO2025162834A1 (en) |
Family Cites Families (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12094286B2 (en) * | 2006-09-05 | 2024-09-17 | Alpvision S.A. | Means for using microstructure of materials surface as a unique identifier |
| US8844801B2 (en) * | 2010-11-26 | 2014-09-30 | International Business Machines Corporation | Identification and trace of items within an assembly or manufacturing process |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025162834A1 (en) | 2025-08-07 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20190287265A1 (en) | System and method for visual identification, and system and method for classifying and sorting | |
| WO2020051959A1 (en) | Image-based costume size measurement method and device | |
| CN112100430B (en) | Article tracing method and device | |
| CN106483940A (en) | Processing meanss, production system and the processing method using processing meanss | |
| Wigger et al. | Robust and fast part traceability in a production chain exploiting inherent, individual surface patterns | |
| CN117896626A (en) | Method, device, equipment and storage medium for detecting motion trajectory with multiple cameras | |
| CN114565894A (en) | Work garment identification method and device, electronic equipment and storage medium | |
| US11068755B2 (en) | Locating method and a locator system for locating a billet in a stack of billets | |
| CN120416321A (en) | Device, method and computer program product for object tracing | |
| CN117852782A (en) | Abnormal parcel processing method and device, electronic equipment and storage medium | |
| CN116091419A (en) | Defect identification method and device based on similarity, electronic equipment and storage medium | |
| CN101697194B (en) | Data processing system and method for improving reliability of RFID application | |
| CN109447516A (en) | The asset management system | |
| US20240193929A1 (en) | Target identification method, device and computer-readable storage medium | |
| CN112529829B (en) | Training method and device for burr positioning and burr detection model | |
| CN109389000B (en) | Bar code identification method and computer applying same | |
| CN109685324A (en) | Cloth detection data processing method, system and equipment | |
| CN114155471B (en) | Design drawings and physical verification methods, devices, computer equipment and systems | |
| Chiun et al. | Object detection based automated optical inspection of printed circuit board assembly using deep learning | |
| CN109977715A (en) | Two-dimensional code identification method and two dimensional code based on outline identification | |
| Duan et al. | Zero‐Shot 3D Pose Estimation of Unseen Object by Two‐step RGB-D Fusion | |
| Fonseka et al. | Localization of component lead inside a THT solder joint for solder defects classification | |
| CN112288060A (en) | Method and apparatus for identifying a tag | |
| CN119693842B (en) | Low-temperature image identification method applied to cold chain industry | |
| CN120259629A (en) | Image processing method, device, equipment, readable storage medium and program product |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |