CN113495162B - Control systems for automated optical inspection equipment - Google Patents
- Publication number
- CN113495162B CN113495162B CN202010200109.5A CN202010200109A CN113495162B CN 113495162 B CN113495162 B CN 113495162B CN 202010200109 A CN202010200109 A CN 202010200109A CN 113495162 B CN113495162 B CN 113495162B
- Authority
- CN
- China
- Prior art keywords
- abstract
- light source
- camera
- trigger
- control computer
- Prior art date
- Legal status
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N35/00—Automatic analysis not limited to methods or materials provided for in any single one of groups G01N1/00 - G01N33/00; Handling materials therefor
- G01N35/00584—Control arrangements for automatic analysers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/01—Arrangements or apparatus for facilitating the optical investigation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
Landscapes
- Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Studio Devices (AREA)
- Image Input (AREA)
Abstract
The invention relates to a control system for automatic optical inspection equipment. The system comprises a control computer that controls one or more inspection stations, where each station can be configured with a physical light source controller, physical cameras, and a physical trigger. Inside the control computer, the physical devices of each station are managed by a main program together with a light source abstraction layer, a camera abstraction layer, and a trigger abstraction layer. According to the user's settings, these abstraction layers generate corresponding abstract objects (abstract light source controllers, abstract camera groups, and abstract triggers) with predefined control rules, so that the main program can control every physical device, or obtain the resources it requires, through the abstraction layers.
Description
Technical Field
The present invention relates to a control system for automatic optical inspection equipment, and more particularly, to a control system capable of adapting to changes in the hardware architecture.
Background
Automatic optical inspection (AOI) equipment is widely used in many fields, relying on machine vision in place of human vision; for example, on high-tech production lines, products are inspected for defects by measuring their appearance with machine vision.
Referring to fig. 13, a conventional automatic optical inspection apparatus generally includes four hardware units: an image capturing unit, a control computer, a mechanism and drive motor unit, and an electric control unit. The control computer controls the operation of the whole apparatus; its internal software is responsible for image processing, device communication, motion control, and so on. The image capturing unit is connected to the control computer and comprises an illumination light source and a camera (CCD) for capturing the appearance of the object under inspection, while the drive motor unit and the electric control unit execute the corresponding mechanical operations according to the control computer's instructions.
In some cases the hardware of an existing AOI device must be expanded or reduced; for example, when the computing performance of the original control computer is insufficient for the volume of data, a new control computer must be added to share the processing workload.
For example, referring to fig. 14, assume the original hardware architecture has a single control computer controlling a first inspection station and a second inspection station, where the first inspection station includes a light source controller, a trigger, and a first camera group, the second inspection station includes a light source controller, a trigger, and a second camera group, and the number of cameras in each camera group may differ. When the control computer receives a hardware in-place signal, it sets the current of the light source controller of the relevant inspection station, and the trigger synchronously triggers the light source and the cameras. The images shot by the first and second camera groups are transmitted to the control computer for subsequent analysis.
Fig. 15 shows a new control computer added to the architecture of fig. 14, with control of some cameras transferred to it: the original control computer is responsible for controlling the first inspection station and receiving the images captured by the first camera group, while the new control computer is responsible for controlling the second inspection station and receiving the images captured by the second camera group.
During the architecture change described above, a programmer must directly modify the original program code to match the new architecture. The portions that may require modification include:
Camera: the program code written for the camera hardware must be adjusted so that the image processing program still receives the image data it requires.
Light source controller: the light source controller drives the illumination light source. When scaling from a single control computer to two, the programmer must modify the original light source controller code for the light source controllers of the different inspection stations, so that each control computer corresponds to its own light source controller.
Trigger interface: the trigger interface starts each camera in the corresponding camera group to capture images; each trigger interface provides several control channels that can be connected to several cameras. Because control of a camera group is transferred to the new control computer, the cameras controlled by the different trigger interfaces also change, and the program code must be modified according to the actual assignment of each control channel.
Image processing program: because the original two camera groups are now handled by different control computers, the image processing program in each control computer must also be modified to match the new hardware architecture.
As can be seen from the foregoing examples, whenever a hardware device changes, the programmer must modify the original program code to match the new architecture, which is time-consuming and inefficient.
Disclosure of Invention
The main object of the present invention is to provide a more flexible control system for automatic optical inspection equipment, one that can quickly adapt to a new hardware configuration without changing the original program code in the control computer when the light source controller, camera, or trigger of the automatic optical inspection equipment needs to be changed.
The invention relates to a control system for automatic optical inspection equipment, used to control at least one inspection station of the equipment, where each inspection station includes a physical light source controller, a physical camera, and a physical trigger. The control system comprises:
A control computer, comprising:
A main program that provides a user interface for the user to input setting parameters, wherein the main program comprises a global control module and at least one main control module;
A light source abstraction layer that generates at least one abstract light source controller according to the user's setting parameters; the abstract light source controller is controlled by the main control module, and the light source abstraction layer controls the physical light source controller or establishes a software light source;
A camera abstraction layer that generates at least one abstract camera group according to the user's setting parameters; the abstract camera group is controlled by the main control module, and the camera abstraction layer controls the physical camera or establishes a software camera;
A trigger abstraction layer that generates at least one abstract trigger according to the user's setting parameters; the abstract trigger is controlled by the main control module, and the trigger abstraction layer controls the physical trigger or establishes a software trigger.
The light source abstraction layer, the trigger abstraction layer, and the camera abstraction layer generate the corresponding abstract objects (abstract light source controllers, abstract camera groups, and abstract triggers) according to the user's settings, detect whether the physical light source controllers, triggers, and cameras exist, and determine from the physical hardware configuration whether software virtualization is needed: when actual hardware is detected, the actual hardware device is controlled through the abstraction layer; when no actual hardware is detected, the function of the hardware is virtualized through the abstraction layer.
The main program only needs to send control instructions to the abstract objects; physical control is achieved through the abstraction layers, and the main program never controls a physical device directly. Therefore, when a hardware device changes, the burden of modifying the program is reduced.
Drawings
FIG. 1 is a system architecture diagram of the present invention;
FIG. 2A is a schematic diagram of the abstraction of two 4-channel physical light source controllers into a single 8-channel abstract light source controller using a light source abstraction layer according to the present invention;
FIG. 2B is a schematic diagram of abstracting one 8-channel physical light source controller into two 4-channel abstract light source controllers using the light source abstraction layer according to the present invention;
FIG. 3A is a schematic diagram of the present invention using a camera abstraction layer to respectively use image data captured by different camera groups as image data sources of different inspection stations;
FIG. 3B is a schematic diagram of the present invention using a camera abstraction layer to take image data captured by the same camera group as image data sources of different inspection stations;
FIG. 4A is a diagram illustrating the implementation of hardware triggers using a trigger abstraction layer in accordance with the present invention;
FIG. 4B is a schematic diagram of the present invention for executing software triggers using a trigger abstraction layer;
FIG. 5 is a schematic diagram of the global control module and the main control module in a single control computer according to the present invention;
FIG. 6 is a control flow diagram of the main control module of the present invention;
FIG. 7 is a schematic diagram of instruction conversion of the abstract layer control actual hardware;
FIG. 8 is a schematic diagram of an abstract layer receiving signals from actual hardware;
FIG. 9 is a schematic diagram of the global control module and the main control module in two control computers according to the present invention;
fig. 10A to 10C are schematic views of a camera group splitting process according to the present invention;
FIGS. 11A-11D are schematic diagrams illustrating a trigger splitting process according to the present invention;
FIGS. 12A-12D are schematic diagrams illustrating a light source controller splitting process according to the present invention;
FIG. 13 is a schematic diagram of the composition of an Automated Optical Inspection (AOI) system;
FIG. 14 is a schematic diagram of an AOI system with a single control computer controlling two inspection stations;
FIG. 15 is a schematic diagram of an AOI system with two control computers controlling two inspection stations, respectively.
Detailed Description
The invention addresses the three main devices in an automatic optical inspection (AOI) system, namely the camera, the trigger (trigger interface), and the light source controller, and establishes a hardware abstraction layer (HAL) for each: a camera abstraction layer, a trigger abstraction layer, and a light source abstraction layer. When the AOI system's hardware needs to change, these abstraction layers simplify the modification work.
According to the user's settings, the light source abstraction layer, the trigger abstraction layer, and the camera abstraction layer can detect whether the physical light source controller, trigger, and camera hardware exist, and thereby determine whether software virtualization is needed. When the actual hardware is detected, it is controlled through the abstraction layer. Because every physical light source controller, trigger, and camera has a unique identification code (ID), for example the machine's serial number (S.N.) or IP address, usually the addressing scheme provided by the hardware manufacturer, the presence of the actual hardware can be judged from this code. When no actual hardware is detected, the function of the hardware is virtualized through the abstraction layer. The control and function of each abstraction layer are described in detail below.
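This detect-or-virtualize behaviour can be illustrated with a minimal sketch; the class and function names below are hypothetical, not taken from the patent:

```python
class PhysicalLightSource:
    """Stands in for a vendor driver addressed by a unique ID (e.g. serial number or IP)."""
    def __init__(self, device_id, channels):
        self.device_id = device_id
        self.channels = channels

    def set_current(self, channel, milliamps):
        print(f"[{self.device_id}] channel {channel} -> {milliamps} mA")

class SoftwareLightSource:
    """Virtual stand-in created when no physical hardware matches the configured ID."""
    def __init__(self, device_id, channels):
        self.device_id = device_id
        self.channels = channels

    def set_current(self, channel, milliamps):
        pass  # a software light source performs no real action

def build_light_source(configured_id, channels, detected_ids):
    """Return a driver for real hardware if the configured ID was detected, else a virtual one."""
    if configured_id in detected_ids:
        return PhysicalLightSource(configured_id, channels)
    return SoftwareLightSource(configured_id, channels)
```

The same pattern applies to cameras and triggers: the abstraction layer compares the configured ID against the IDs actually enumerated on the machine and falls back to a software object when nothing matches.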
Referring to fig. 1, a control computer 1 in an AOI system can control different hardware, mainly including a physical light source controller 11, a physical camera 12, and a physical trigger 13, where the physical trigger 13 triggers the physical light source controller 11 and the physical camera 12 simultaneously; the number of each kind of device shown in fig. 1 is merely illustrative.
The software control system is installed in the control computer 1 and includes a main program 20, a light source abstraction layer 30, a camera abstraction layer 40, and a trigger abstraction layer 50. The light source abstraction layer 30, camera abstraction layer 40, and trigger abstraction layer 50 can be independent programs that communicate through standard protocols, or they can be integrated into the same executable file as the main program 20.
Each control computer 1 has a main program 20, and the main program 20 includes a global control module 21 and a number of main control modules 22 corresponding to the inspection stations; for example, if the control computer 1 controls two inspection stations, the main program 20 has two main control modules 22. The main program 20 provides a visual interface for the user to configure the system and display its current state; through this interface the user can set the camera groups required by each inspection station, the number of cameras contained in each camera group, and the configuration of the triggers, light source controllers, and so on.
The light source abstraction layer 30 can establish an abstract light source controller 31 according to the setting parameters the user inputs through the user interface (i.e. the configuration table of the light source abstraction layer 30); the light source abstraction layer 30 and the main control module 22 exchange instructions through a predefined communication protocol. The light source abstraction layer 30 determines whether physical light source controllers 11 exist and how many there are, and decides whether to directly control the physical light source controller 11 or to virtually generate a software light source. If a physical light source controller 11 is available for control, then, as in the example of fig. 2A, the light source abstraction layer 30 can abstract two 4-channel physical light source controllers 11 into a single 8-channel abstract light source controller 31, i.e. a one-to-many abstraction, where each 4-channel physical light source controller 11 can be triggered to output drive current to activate the physical illumination source (e.g. an LED light source) connected to each channel. As in the example of fig. 2B, the light source abstraction layer 30 can also abstract one 8-channel physical light source controller 11 into two 4-channel abstract light source controllers 31, i.e. a many-to-one abstraction. If the light source abstraction layer 30 determines that no real physical light source controller 11 exists, it virtually generates a software light source, although the software light source essentially performs no action.
As another example, if the user wants to construct a 12-channel abstract light source controller 31, the light source abstraction layer 30 can build it in any of the following ways; regardless of how it is built, the main control module 22 sees the abstract light source controller 31 as a 12-channel device:
(1) 12 channels: select one 12-channel physical light source controller by its identification code.
(2) 4 channels + 8 channels: combine a 4-channel and an 8-channel physical light source controller by selecting the identification code of each.
(3) 4 channels + 4 channels + 4 channels: combine three 4-channel physical light source controllers by selecting the identification codes of all three.
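The composition options above amount to mapping a contiguous abstract channel range onto the channels of one or more physical controllers. A minimal sketch of that mapping (names are illustrative, not from the patent):

```python
class AbstractLightSourceController:
    """Maps a contiguous abstract channel range onto one or more physical controllers."""
    def __init__(self, physical_controllers):
        # physical_controllers: list of (controller_id, channel_count) pairs,
        # in the order their channels are concatenated.
        self.segments = []
        base = 0
        for dev_id, count in physical_controllers:
            self.segments.append((base, base + count, dev_id))
            base += count
        self.channels = base  # total abstract channel count seen by the main control module

    def resolve(self, abstract_channel):
        """Translate a 0-based abstract channel into (controller_id, local_channel)."""
        for start, end, dev_id in self.segments:
            if start <= abstract_channel < end:
                return dev_id, abstract_channel - start
        raise ValueError("abstract channel out of range")
```

For example, `AbstractLightSourceController([("8ch", 8), ("4ch", 4)])` presents 12 channels to the main control module, with abstract channel 9 resolving to local channel 1 of the 4-channel device.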
The camera abstraction layer 40 can establish a number of abstract camera groups 41A, 41B according to the settings input by the user at the user interface, and specify the cameras to be included in each abstract camera group 41A, 41B. When cameras are specified, each camera's vendor name and hardware serial number serve as its identification information. The camera abstraction layer 40 determines whether the physical cameras 12 are present, and in what number, and accordingly decides whether to directly control the physical cameras 12 or to virtualize software cameras. As before, the camera abstraction layer 40 and the main control module 22 exchange commands via a predefined communication protocol. In the example of fig. 1 and fig. 3A, the user wants to create two abstract camera groups, where abstract camera group 41A is specified to include two cameras and abstract camera group 41B to include one camera. If the camera abstraction layer 40 detects the two physical cameras 12 specified for abstract camera group 41A, the image data generated when the two physical cameras 12 actually shoot is provided to the main program 20; otherwise, when the camera abstraction layer 40 does not detect the two specified physical cameras 12, two software cameras are virtualized, that is, image data is read from a database (such as a hard disk) and provided to the main program 20. Similarly, the camera abstraction layer 40 determines whether the camera in the other abstract camera group 41B has a corresponding physical camera 12, and if not, a software camera is virtualized.
In fig. 3A, the camera abstraction layer 40 may use the image data of different abstract camera groups 41A and 41B as the image data of different inspection stations, that is, the image data of the first inspection station may be obtained from the abstract camera group 41A, and the image data of the second inspection station may be obtained from the abstract camera group 41B. In other embodiments, as shown in fig. 3B, the camera abstraction layer 40 may use the image data of a single abstract camera group 41 as the image data of multiple inspection stations, that is, different inspection stations may use the image data of the same abstract camera group 41.
The image data generated by the cameras can be divided into synchronous and asynchronous images according to how the images are generated. A synchronous image set is the image data obtained when different cameras shoot at the same time point; an asynchronous image set is the group of images generated by the same camera group at different time points. The relationship between (number of cameras, number of shots) and the total number of images is shown in the table below:
(cameras, shots) : total images
(1, 1) : 1
(1, 2) : 2
(2, 1) : 2
(2, 2) : 4
Taking (1, 2) in the table above as an example, one camera shooting twice yields 2 images. Taking (2, 2) as an example, two cameras each shooting twice yield 4 images.
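The counting rule in these examples is simply cameras × shots; a one-line sketch makes it explicit:

```python
def image_count(cameras, shots):
    """Total images produced when `cameras` cameras each capture `shots` times.

    One shot across several cameras yields a synchronous image set;
    repeated shots by the same group yield asynchronous image sets.
    """
    return cameras * shots
```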
The trigger abstraction layer 50 can create an abstract trigger 51 according to the setting parameters input by the user at the user interface. The trigger abstraction layer 50 determines whether the physical trigger 13 exists, and accordingly decides whether to control the physical trigger 13 or to virtualize a software trigger. Referring to fig. 4A, taking a 2-channel physical trigger 13 as an example, after the trigger abstraction layer 50 receives an in-place signal, the two channels of the physical trigger 13 can be driven to send trigger signals synchronously to control the camera or the light source. Referring to fig. 4B, the trigger abstraction layer 50 may instead send a trigger signal itself after receiving the in-place signal, directly controlling the camera or the light source, that is, completing the trigger in the form of a software trigger.
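The two trigger paths of fig. 4A and 4B can be sketched as two classes with the same interface, so the main program does not care which one is behind the abstract trigger (a hedged sketch; all names are illustrative):

```python
class HardwareTrigger:
    """Fig. 4A path: drive the channels of a physical trigger on an in-place signal."""
    def __init__(self, fire_channel):
        self._fire_channel = fire_channel  # callable supplied by the trigger driver

    def on_in_place(self, channels):
        for ch in channels:
            self._fire_channel(ch)  # hardware fans the signal out to light source / camera

class SoftwareTrigger:
    """Fig. 4B path: no physical trigger; call the camera / light-source hooks directly."""
    def __init__(self, targets):
        self._targets = targets  # callables that start a capture or switch on a source

    def on_in_place(self, channels=None):
        for target in self._targets:
            target()
```

Because both objects expose `on_in_place`, the abstract trigger 51 can wrap either one depending on whether the physical trigger 13 was detected.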
Referring to fig. 5, the global control module 21 in the control computer 1 designates the inspection station for which each main control module 22A, 22B is responsible, and each main control module 22A, 22B controls the abstract light source controller 31A or 31B, abstract camera group 41A or 41B, and abstract trigger 51A or 51B corresponding to its inspection station. For example, after the control computer 1 assigns the main control module 22A to the first inspection station and the other main control module 22B to the second inspection station, the main control module 22A controls the abstract light source controller 31A, abstract camera group 41A, and abstract trigger 51A corresponding to the first inspection station, while the main control module 22B controls the abstract light source controller 31B, abstract camera group 41B, and abstract trigger 51B corresponding to the second inspection station. What each main control module 22A, 22B controls is not a real physical device but an abstract object; the physical devices are controlled by the corresponding abstraction layers. Note that the abstract light source controllers 31A and 31B of the different inspection stations share the same light source abstraction layer 30, and the same holds for the camera abstraction layer 40 and the trigger abstraction layer 50.
Referring to fig. 6, which shows the control flow of a single main control module 22, the general flow is as follows: when the main control module 22 receives an in-place signal (S1), it configures the light source and trigger conditions according to the inspection flow (S2), then waits for the trigger (S3); after the camera receives the trigger it shoots to generate an image, and the main control module 22 finally receives the image data (S4). The image inspection flow in the main control module 22 then analyzes the received image data.
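The S1–S4 flow can be sketched as one function whose steps are injected as callables, mirroring how the main control module only issues commands to abstract objects (a sketch under stated assumptions; the function and parameter names are hypothetical):

```python
def inspection_cycle(set_light_and_trigger, wait_for_trigger, receive_images, analyze):
    """One pass of the main control module's flow, entered after the in-place
    signal (S1) arrives: S2 configure light source and trigger, S3 wait for the
    trigger to fire, S4 receive the captured image data, then analyze it."""
    set_light_and_trigger()    # S2: set currents / trigger conditions via abstraction layers
    wait_for_trigger()         # S3: block until the (hardware or software) trigger fires
    images = receive_images()  # S4: collect image data from the abstract camera group
    return analyze(images)     # hand the images to the inspection flow
```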
The main control module 22 pre-establishes generic instructions for the different abstraction layers: generic instructions for controlling the light source are pre-established for the light source abstraction layer 30, generic instructions for controlling the camera for the camera abstraction layer 40, and likewise generic instructions for the trigger abstraction layer 50 to control its corresponding abstract objects. In the present invention, different translation rules are pre-established in each abstraction layer; when an abstraction layer receives a generic instruction, it translates the instruction into an actual hardware instruction and outputs it to control the physical device, i.e. the physical light source controller 11, the physical camera 12, or the physical trigger 13. An actual hardware instruction is typically an API (application programming interface) instruction provided by the hardware device manufacturer, so the translation rules are set according to the specifications of the different manufacturers; even for the same generic instruction, the actual hardware devices of different manufacturers have different corresponding actual hardware instructions. For example, suppose the generic command sent by the main control module 22 is "capture a camera image at position A". If the camera abstraction layer 40 finds, by comparing the specification, model, and manufacturer of the actual hardware device, that it comes from a first manufacturer, the camera abstraction layer 40 translates the generic command into the actual hardware command of the first manufacturer; if the actual hardware device is provided by a second manufacturer, the camera abstraction layer 40 translates the generic command into the actual hardware command of the second manufacturer.
Thus the generic instructions are widely applicable across hardware device manufacturers, and the main control module 22 need only issue pre-established high-level commands; the low-level commands that control the actual hardware devices are built by the abstraction layers. The programmer only needs to define the common generic instructions and the translation rules for the different hardware manufacturers, and each abstraction layer can then control the actual hardware devices.
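The translation idea above can be sketched as a per-vendor lookup table; the vendor names and command strings below are invented for illustration, not real device APIs:

```python
# Hypothetical translation rules: one generic command name maps, per vendor,
# to a template for that vendor's actual hardware instruction.
TRANSLATION_RULES = {
    "vendor_a": {"capture": "ACQ START", "set_current": "CUR {ch} {ma}"},
    "vendor_b": {"capture": "snap()",    "set_current": "led[{ch}]={ma}"},
}

def translate(vendor, generic_command, **params):
    """Translate a generic instruction into the given vendor's hardware instruction."""
    template = TRANSLATION_RULES[vendor][generic_command]
    return template.format(**params)
```

The main control module always sends `"set_current"`; which concrete string reaches the wire depends only on which vendor's rules the abstraction layer selected after comparing the detected hardware.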
Referring to fig. 7, it is further described how the main control module 22 achieves control of the physical devices through the abstraction layers. Taking the light source abstraction layer 30 as an example, assume the generic instruction sent by the main control module 22 controls a 12-channel light source (i.e. the abstract light source controller is 12-channel); after receiving the generic instruction, the light source abstraction layer 30 determines whether to split the instruction according to its own configuration table.
Taking the upper part of fig. 7 as an example: if, according to the configuration table, the 12-channel abstract light source controller is composed of an 8-channel and a 4-channel physical light source controller, then when the light source abstraction layer 30 receives a generic command from the main control module 22, it splits the command across two software sub-controllers and translates it into two sets of actual hardware commands that control the 8-channel and the 4-channel physical light source controllers respectively.
Taking the lower part of fig. 7 as an example: if, according to the configuration table, the 12-channel abstract light source controller is implemented by a single 12-channel physical light source controller, then when the light source abstraction layer 30 receives a generic instruction from the main control module 22, it translates the instruction into one actual hardware instruction that controls the 12-channel physical light source controller.
Referring to fig. 8, a reply from a physical device to the main control module 22 occurs only with the physical cameras, so the camera abstraction layer 40 is used for illustration. A physical camera outputs a completion signal when capturing is finished. For the abstract camera group 41A, which was previously set to correspond to the two synchronized physical cameras 12A and 12B, the abstract camera group 41A determines whether both synchronized physical cameras 12A and 12B have completed capturing, and if so, it calls back the main control module 22 to notify it that capturing is complete. The other abstract camera group 41B is set to correspond to one physical camera 12, so the abstract camera group 41B determines whether that physical camera 12 has completed capturing, and if so, responds to the main control module 22.
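The completion-callback behaviour described here is an aggregation: the group fires its callback only after every member camera has reported done. A minimal sketch (class and method names are illustrative):

```python
class AbstractCameraGroup:
    """Calls back the main control module only after every camera in the group reports done."""
    def __init__(self, camera_ids, on_complete):
        self._all = set(camera_ids)
        self._pending = set(camera_ids)
        self._on_complete = on_complete  # callback into the main control module

    def camera_done(self, camera_id):
        """Handle one camera's completion signal; fire the callback when all are in."""
        self._pending.discard(camera_id)
        if not self._pending:
            self._pending = set(self._all)  # reset for the next capture cycle
            self._on_complete()
```

A group of one camera (like abstract camera group 41B) is just the degenerate case: its single completion signal immediately triggers the callback.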
Referring to fig. 9, take two control computers 1A and 1B as an example; each control computer 1A, 1B has its own global control module 21A, 21B. The control computer 1A sets the main control module 22A to be responsible for the first inspection station, and the main control module 22A controls the abstract light source controller 31A, abstract camera group 41A, and abstract trigger 51A corresponding to the first inspection station. The other control computer 1B sets the main control module 22B to be responsible for the second inspection station and controls the abstract light source controller 31B, abstract camera group 41B, and abstract trigger 51B corresponding to the second inspection station.
The following further describes how the present invention is reconfigured when the hardware configuration of the AOI system changes:
Referring to fig. 10A, assume that in a control computer 1A two main control modules 22A and 22B correspond to two inspection stations, where the main control module 22A controls an abstract camera group 41A that corresponds to two physical cameras 121 and 122, and the other main control module 22B controls another abstract camera group 41B that corresponds to one physical camera 123. When another control computer 1B is added, the user reconnects the physical camera 123, originally connected to the control computer 1A, to the newly added control computer 1B, completing the hardware wiring change; the user then removes the abstract camera group 41B through the user interface of the main program 20 of the original control computer 1A, adds a new abstract camera group 41B through the user interface of the main program in the new control computer 1B, and designates the physical camera 123 as its member. Referring to fig. 10B and 10C, the main control module 22B in the original control computer 1A is moved to the new control computer 1B, and the global control module 21B in the control computer 1B then sets the main control module 22B to be responsible for controlling the abstract camera group 41B. This completes the camera group splitting process, moving the abstract camera group 41B under the control of the new control computer 1B.
Referring to fig. 11A, a control computer 1A contains two main control modules 22A and 22B corresponding to two inspection stations. The main control module 22A controls an abstract trigger 51A, which controls a first channel and a second channel on the physical trigger 13A; each of these channels can be connected to a light source or a camera. The other main control module 22B controls another abstract trigger 51B, which controls a third channel on the physical trigger 13A. Referring to fig. 11B, after another control computer 1B is added, a physical trigger 13B is installed on the control computer 1B, and the device controlled by the third channel (such as a camera or an illumination light source) is reconnected to the physical trigger 13B, completing the hardware change. Referring to fig. 11C, the abstract trigger 51B is removed through the user interface of the main program 20 on the original control computer 1A, the main control module 22B on the original control computer 1A is moved to the new control computer 1B, and a new abstract trigger 51B is added through the user interface of the main program on the new control computer 1B. Referring to fig. 11D, the global control module 21B on the control computer 1B sets the main control module 22B to be responsible for controlling the abstract trigger 51B. This completes the trigger splitting process, transferring the abstract trigger 51B to the control of the newly added control computer 1B.
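The trigger split of figs. 11A–11D can be modeled as a change of channel mapping. This sketch is an assumption, not the patent's implementation: an abstract trigger maps its logical channels to (physical trigger, physical channel) pairs, and the channel number used on 13B is invented here.

```python
# Hypothetical model: moving a logical channel to a new physical trigger only
# changes the mapping, not the code that fires the trigger.
def fire(abstract_trigger: dict, channel: int) -> str:
    """Resolve a logical channel and emit a hardware-level trigger command."""
    board, phys = abstract_trigger[channel]
    return f"TRIG {board} CH{phys}"

abstract_trigger_51a = {1: ("13A", 1), 2: ("13A", 2)}   # stays on computer 1A

# Before the change, 51B drives the third channel on physical trigger 13A:
abstract_trigger_51b = {3: ("13A", 3)}
# After installing 13B on computer 1B and reconnecting the device, 51B is
# recreated against the new physical trigger (channel 1 on 13B is assumed):
abstract_trigger_51b = {3: ("13B", 1)}

print(fire(abstract_trigger_51b, 3))  # prints TRIG 13B CH1
```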
Referring to fig. 12A, a control computer 1A contains two main control modules 22A and 22B, respectively corresponding to two inspection stations. The main control module 22A controls a 12-channel abstract light source controller 31A, which controls two physical light source controllers 11A and 11B; the other main control module 22B controls an 8-channel abstract light source controller 31B, which controls one physical light source controller 11C. In fig. 12B, when another control computer 1B is added, the physical light source controller 11C is reconnected to the control computer 1B, completing the hardware change. Referring to fig. 12C, the abstract light source controller 31B is removed through the user interface of the main program 20 on the original control computer 1A, and the user adds a new abstract light source controller 31B through the user interface of the main program on the new control computer 1B, setting the desired light sources as needed. Referring to fig. 12D, the main control module 22B on the original control computer 1A is moved to the new control computer 1B, and the global control module 21B on the control computer 1B then sets the main control module 22B to be responsible for controlling the abstract light source controller 31B. This completes the light-source-controller splitting process, transferring the abstract light source controller 31B to the control of the new control computer 1B.
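How a 12-channel abstract light source controller can fan out over two physical controllers (as 31A does over 11A and 11B in fig. 12A) can be sketched as a channel map. The per-device channel counts (6 + 6) are an assumption for illustration; the patent only states the abstract totals of 12 and 8 channels.

```python
# Assumed sketch: consecutive abstract channels are mapped onto
# (physical controller, local channel) pairs, so a main control module can
# address channels 1-12 without knowing which physical box serves each one.
def build_channel_map(physical: list) -> dict:
    """Map consecutive abstract channels onto (controller, local channel) pairs."""
    mapping, ch = {}, 1
    for name, n_channels in physical:
        for local in range(1, n_channels + 1):
            mapping[ch] = (name, local)
            ch += 1
    return mapping

controller_31a = build_channel_map([("11A", 6), ("11B", 6)])   # 12 channels total
controller_31b = build_channel_map([("11C", 8)])               # 8 channels total

print(controller_31a[7])  # prints ('11B', 1)
```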
Through the splitting processes of the light source controller, the camera group, and the trigger described above, the light source abstraction layer 30, the camera abstraction layer 40, and the trigger abstraction layer 50, constructed in advance and shared across different control computers, greatly reduce the need for a programmer to edit the original program code directly. Although the above examples add a control computer, the present invention can equally consolidate an architecture in which multiple control computers separately control different inspection stations into one in which a single control computer controls multiple inspection stations. The user only needs to configure the system according to the actual hardware environment, and the control computer controls the actual hardware through the light source abstraction layer 30, the camera abstraction layer 40, and the trigger abstraction layer 50, making the overall system more flexible to design and maintain.
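The shared-abstraction idea above can be made concrete with a translation-rule sketch: main control modules issue the same general instruction regardless of hardware, and the abstraction layer translates it per device. The device models and command strings below are invented purely for illustration.

```python
# Minimal sketch, assuming hypothetical light source controller models: the
# abstraction layer holds per-model translation rules and converts a general
# "set brightness" instruction into an actual hardware command.
TRANSLATION_RULES = {
    "vendorA-LC100": lambda ch, lvl: f"SETCH{ch:02d}:{lvl}",   # invented syntax
    "vendorB-LS8":   lambda ch, lvl: f"L{ch},{lvl}\r",         # invented syntax
}

def set_brightness(model: str, channel: int, level: int) -> str:
    """Translate the general 'set brightness' instruction into a hardware command."""
    return TRANSLATION_RULES[model](channel, level)

# The same general call works against either physical controller:
print(set_brightness("vendorA-LC100", 3, 128))  # prints SETCH03:128
```

This is the property the description relies on: moving a device to another control computer never requires rewriting the main control modules, only re-pointing an abstraction-layer entry.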
Claims (8)
1. A control system for an automated optical inspection apparatus, for controlling at least one inspection station of the automated optical inspection apparatus, wherein each inspection station has a physical hardware configuration including a physical light source controller, a physical camera, and a physical trigger, the control system comprising:
A first control computer comprising:
a main program providing a user interface for a user to input setting parameters, the main program comprising a global control module and a plurality of main control modules, wherein the global control module assigns the corresponding inspection station controlled by each main control module;
a light source abstraction layer, which generates at least one abstract light source controller according to the user's setting parameters, wherein the abstract light source controller is controlled by the plurality of main control modules, and the light source abstraction layer is used to control the physical light source controller or to establish a software light source;
a camera abstraction layer, which generates at least one abstract camera group according to the user's setting parameters, wherein the abstract camera group is controlled by the plurality of main control modules, and the camera abstraction layer is used to control the physical camera or to establish a software camera; when the camera abstraction layer determines that no physical camera exists in the inspection station, the camera abstraction layer retrieves image data from a database to serve as the software camera;
a trigger abstraction layer, which generates at least one abstract trigger according to the user's setting parameters, wherein the abstract trigger is controlled by the plurality of main control modules, and the trigger abstraction layer is used to control the physical trigger or to establish a software trigger;
wherein the main control modules share the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer, and are each responsible for controlling a corresponding inspection station;
wherein, when a physical camera of the first control computer is to be transferred to a second control computer, the physical camera to be transferred is reconnected to the second control computer, and a corresponding abstract camera group is newly added in the second control computer; one of the main control modules of the first control computer is moved to the second control computer, and the global control module of the second control computer sets the moved main control module to be responsible for controlling the newly added abstract camera group, wherein the main control module moved to the second control computer corresponds to the physical camera moved from the first control computer to the second control computer;
when a physical light source controller of the first control computer is to be transferred to the second control computer, the physical light source controller to be transferred is reconnected to the second control computer, and a corresponding abstract light source controller is newly added in the second control computer; one of the main control modules of the first control computer is moved to the second control computer, and the global control module of the second control computer sets the moved main control module to be responsible for controlling the newly added abstract light source controller, wherein the main control module moved to the second control computer corresponds to the physical light source controller moved from the first control computer to the second control computer;
when an abstract trigger of the first control computer is to be transferred to the second control computer for execution, a physical trigger is installed in the second control computer; one of the main control modules of the first control computer is moved to the second control computer, the abstract trigger is newly added in the second control computer, and the global control module of the second control computer sets the moved main control module to be responsible for controlling the newly added abstract trigger, wherein the main control module moved to the second control computer corresponds to the abstract trigger transferred from the first control computer to the second control computer.
2. The control system of an automated optical inspection apparatus according to claim 1, wherein each of the main control modules is responsible for processing image data generated by its corresponding inspection station.
3. The control system of an automated optical inspection apparatus of claim 2, wherein the light source abstraction layer controls data conversion between the abstract light source controller and a physical light source controller.
4. The control system of an automated optical inspection apparatus of claim 2, wherein the software trigger is a trigger signal generated using the trigger abstraction layer.
5. The control system of an automated optical inspection apparatus of claim 2, wherein the physical trigger is a trigger interface having a plurality of trigger channels.
6. The control system of an automated optical inspection apparatus according to any one of claims 1 to 5, wherein general instructions corresponding to the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer are preset in each of the main control modules, and translation rules for translating the general instructions into actual hardware instructions are preset in each of the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer.
7. The control system of an automated optical inspection apparatus according to claim 6, wherein the actual hardware instructions translated by the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer are used to control the physical light source controller, the physical camera, and the physical trigger, respectively;
after receiving the general instructions sent by the main control modules, the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer compare them against the actual hardware configuration of the inspection station, and establish the actual hardware instructions according to the comparison result and the corresponding translation rules.
8. The control system of an automated optical inspection apparatus according to claim 6, wherein, when comparing against the actual hardware configuration of the inspection station, the light source abstraction layer, the camera abstraction layer, and the trigger abstraction layer compare the hardware specification, model, and manufacturer of the inspection station.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010200109.5A CN113495162B (en) | 2020-03-20 | 2020-03-20 | Control systems for automated optical inspection equipment |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN113495162A CN113495162A (en) | 2021-10-12 |
| CN113495162B true CN113495162B (en) | 2025-04-01 |
Family
ID=77993629
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010200109.5A Active CN113495162B (en) | 2020-03-20 | 2020-03-20 | Control systems for automated optical inspection equipment |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN113495162B (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP4325311A1 (en) | 2022-07-06 | 2024-02-21 | Contemporary Amperex Technology Co., Limited | Debugging method and apparatus for production line devices, and production line system |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101512359A (en) * | 2006-07-10 | 2009-08-19 | 阿斯特瑞昂公司 | System and method for performing processing in a testing system |
| CN203287332U (en) * | 2013-05-17 | 2013-11-13 | 深圳明锐理想科技有限公司 | Immovable automatic optical check system |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20040025173A1 (en) * | 2002-04-24 | 2004-02-05 | Gil Levonai | Interaction abstraction system and method |
| JP2011221803A (en) * | 2010-04-09 | 2011-11-04 | Toyota Motor Corp | Test tool and test method |
| CN107449778B (en) * | 2016-05-31 | 2018-11-23 | 上海微电子装备(集团)股份有限公司 | A kind of automatic optical detection device and method |
| US10705511B2 (en) * | 2018-07-11 | 2020-07-07 | Siemens Aktiengesellschaft | Abstraction layers for automation applications |
- 2020-03-20: CN CN202010200109.5A patent/CN113495162B/en active Active
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN106897688B (en) | Interactive projection apparatus, method of controlling interactive projection, and readable storage medium | |
| JP7216190B2 (en) | Modular Acceleration Module for Programmable Logic Controller Based Artificial Intelligence | |
| EP2843556B1 (en) | Simulator, simulation method, and simulation program | |
| US20160246278A1 (en) | Simulation system, programmable controller, simulation device, and engineering tool | |
| CN109426353B (en) | System module for customizing display frame in non-invasive data acquisition system | |
| AU2014217524A1 (en) | Flexible room controls | |
| US11209790B2 (en) | Actuator control system, actuator control method, information processing program, and storage medium | |
| US12346108B2 (en) | Method for remote assistance and device | |
| CN113495162B (en) | Control systems for automated optical inspection equipment | |
| CN105807730B (en) | A kind of digital equipment industrial control method, apparatus and system | |
| US10474124B2 (en) | Image processing system, image processing device, method of reconfiguring circuit in FPGA, and program for reconfiguring circuit in FPGA | |
| US20170050319A1 (en) | Programmable Machine Vision Device | |
| US20210097341A1 (en) | Systems and methods for training image detection systems for augmented and mixed reality applications | |
| CN112148241B (en) | Light processing method, device, computing equipment and storage medium | |
| CN113377583B (en) | Display screen controller backup method, device and system | |
| TWI825289B (en) | Control system for automatic optical inspection equipment | |
| EP3745332B1 (en) | Systems, device and method of managing a building automation environment | |
| JP2008249907A (en) | Projector and display system | |
| CN112233208B (en) | Robot state processing method, apparatus, computing device and storage medium | |
| CN110764841B (en) | 3D visual application development platform and development method | |
| CN114900533A (en) | Dimming data synchronization method, system, equipment and storage medium based on Ethernet | |
| CN111767075A (en) | Method and device for synchronizing application program versions | |
| WO2013081304A1 (en) | Image conversion apparatus and method for converting two-dimensional image to three-dimensional image, and recording medium for same | |
| JP7067869B2 (en) | Image processing systems, information processing equipment, information processing methods, and information processing programs | |
| Plewiński et al. | Remote control of 3D camera rig with embedded system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||
| TG01 | Patent term adjustment | ||