
WO2018179421A1 - Computer system, artifact diagnosis method, and program

Computer system, artifact diagnosis method, and program

Info

Publication number
WO2018179421A1
WO2018179421A1 (PCT Application No. PCT/JP2017/013821)
Authority
WO
WIPO (PCT)
Prior art keywords
artifact
image
image analysis
acquired
computer system
Prior art date
Application number
PCT/JP2017/013821
Other languages
English (en)
Japanese (ja)
Inventor
俊二 菅谷
佳雄 奥村
Original Assignee
株式会社オプティム
Priority date
Filing date
Publication date
Application filed by 株式会社オプティム
Priority to PCT/JP2017/013821
Publication of WO2018179421A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 Systems specially adapted for particular applications
    • G01N 21/88 Investigating the presence of flaws or contamination
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion

Definitions

  • The present invention relates to a computer system, an artifact diagnosis method, and a program for diagnosing an artifact.
  • In thermographic image diagnosis, an operator can grasp the temperature distribution of facilities or equipment without contacting them (see Non-Patent Document 1). An operator can also grasp damage or discoloration of facilities or devices by monitoring visible light images.
  • In Non-Patent Document 1 or 2, however, an abnormality of a facility or device is determined based only on acquired image data or non-image data; the diagnosis is not one performed by analyzing a plurality of image data together.
  • Therefore, the present invention provides the following solutions.
  • The present invention provides a computer system for diagnosing an artifact, comprising: first image acquisition means for acquiring a plurality of first artifact images that follow a time-series change of the artifact; first image analysis means for performing image analysis on the acquired first artifact images; second image acquisition means for acquiring a plurality of second artifact images that follow a past time-series change of another artifact; second image analysis means for performing image analysis on the acquired second artifact images; collating means for collating the result of image analysis of the first artifact images with the result of image analysis of the second artifact images; and diagnostic means for diagnosing the artifact based on the collation result.
  • According to the present invention, a computer system for diagnosing an artifact acquires a plurality of first artifact images that follow a time-series change of the artifact, performs image analysis on the acquired first artifact images, acquires a plurality of second artifact images that follow a past time-series change of another artifact, performs image analysis on the acquired second artifact images, collates the result of image analysis of the first artifact images with the result of image analysis of the second artifact images, and diagnoses the artifact based on the collation result.
  • Although the present invention is described in the category of a computer system, the same actions and effects are exhibited in other categories such as an artifact diagnosis method and a program.
  • According to the present invention, it is possible to provide a computer system, an artifact diagnosis method, and a program that combine a plurality of time-series image data and thereby improve diagnosis accuracy compared with conventional diagnosis based on single-image analysis.
  • FIG. 1 is a diagram showing an outline of the artifact diagnosis system 1.
  • FIG. 2 is an overall configuration diagram of the artifact diagnosis system 1.
  • FIG. 3 is a functional block diagram of the computer 10.
  • FIG. 4 is a flowchart showing a learning process executed by the computer 10.
  • FIG. 5 is a flowchart showing an artifact diagnosis process executed by the computer 10.
  • FIG. 6 is a diagram illustrating a first artifact image and a second artifact image that the computer 10 collates.
  • FIG. 1 is a diagram for explaining an outline of an artifact diagnosis system 1 which is a preferred embodiment of the present invention.
  • The artifact diagnosis system 1 includes a computer 10 and is a computer system that diagnoses artifacts.
  • The artifacts diagnosed by the artifact diagnosis system 1 are, for example, pipes, roads, bridges, buildings, and other arbitrary man-made objects (vehicles, air conditioners, home appliances, information processing devices, etc.).
  • In the following description, as an example, the artifact diagnosis system 1 acquires images of a pipe, performs image analysis on images in which target parts of the pipe images have been marked (enclosed), and diagnoses a crack in the pipe.
  • The computer 10 is a computing device connected to various imaging devices (not shown) such as an infrared camera, a visible light camera, an X-ray camera, and an ultrasonic camera, and to various devices that store or measure environmental data such as internal flow rate, temperature, and humidity.
  • The computer 10 acquires a plurality of first artifact images that follow a time-series change of the artifact (step S01).
  • The computer 10 acquires, as the first artifact images, any one or a combination of X-ray images, infrared images, ultrasonic images, or visible light images.
  • The computer 10 obtains these first artifact images captured by the various imaging devices described above.
  • The first artifact image is not limited to the types of images described above and may be another type of image.
  • The computer 10 performs image analysis on the acquired first artifact images (step S02).
  • The computer 10 performs image analysis by analyzing either or both of the feature points and feature amounts of the first artifact images.
  • A feature point is something that appears in an image, specifically its shape, color, brightness, outline, and the like.
  • A feature amount is a statistical value calculated from the image data, such as the average of pixel values, the variance, or a histogram.
  • The computer 10 may perform machine learning in advance using either or both of the feature points and feature amounts of second artifact images, described later, as teacher data, and may perform image analysis on the first artifact images based on the learning result. Further, the computer 10 may perform image analysis on a first artifact image that has been marked (enclosed) by a terminal device (not shown) or the like. Marking means enclosing a specific part of the image.
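  • As an illustration of the image analysis described above (this sketch is not part of the original publication), the following Python code extracts feature points and feature amounts from a single artifact image. The file name, the use of OpenCV's ORB detector, and the 32-bin histogram are assumptions made for the example.

```python
import cv2
import numpy as np

def extract_features(image_path: str):
    """Return (keypoints, descriptors, feature_amount_vector) for one image."""
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if image is None:
        raise FileNotFoundError(image_path)

    # Feature points: shape/outline information detected as ORB keypoints.
    orb = cv2.ORB_create()
    keypoints, descriptors = orb.detectAndCompute(image, None)

    # Feature amounts: statistical values calculated from the pixel data
    # (average of pixel values, variance, and a normalized brightness histogram).
    mean = float(image.mean())
    variance = float(image.var())
    hist = cv2.calcHist([image], [0], None, [32], [0, 256]).flatten()
    hist = hist / hist.sum() if hist.sum() > 0 else hist

    return keypoints, descriptors, np.concatenate(([mean, variance], hist))

# Example: kps, desc, amounts = extract_features("pipe_t0.png")
```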
  • The computer 10 acquires a plurality of second artifact images that follow a past time-series change of another artifact (step S03).
  • The computer 10 acquires the second artifact images from another computer or database (not shown). At this time, the computer 10 acquires one or a plurality of second artifact images.
  • The computer 10 performs image analysis on the acquired second artifact images (step S04).
  • The computer 10 performs image analysis by analyzing either or both of the feature points and feature amounts of the second artifact images.
  • The computer 10 executes image analysis similar to the image analysis of the first artifact images described above. That is, when the computer 10 analyzes feature points of the first artifact images, it analyzes feature points of the second artifact images; when it analyzes feature amounts of the first artifact images, it analyzes feature amounts of the second artifact images; and when it analyzes both feature points and feature amounts of the first artifact images, it analyzes both feature points and feature amounts of the second artifact images.
  • The computer 10 may also perform image analysis on a second artifact image that has been marked by a terminal device (not shown) or the like.
  • The computer 10 collates the image analysis result of the first artifact images with the image analysis result of the second artifact images (step S05).
  • The computer 10 collates either or both of the feature points and feature amounts analyzed from the first artifact images with either or both of the feature points and feature amounts analyzed from the second artifact images.
  • The computer 10 diagnoses the artifact based on the collation result (step S06). For example, the computer 10 calculates the similarity between the first artifact images and the second artifact images based on the collation result and diagnoses the artifact.
  • The computer 10 may also diagnose a risk related to a defect, such as a crack in the artifact, based on the collation result.
  • The risk related to a defect indicates, for example, the occurrence rate of the defect in the diagnosed artifact as a percentage.
  • FIG. 2 is a diagram showing a system configuration of the artifact diagnosis system 1 which is a preferred embodiment of the present invention.
  • The artifact diagnosis system 1 includes a computer 10 and a public line network 5 (the Internet, a third- or fourth-generation communication network, etc.) and is a computer system that diagnoses artifacts.
  • The computer 10 is the computing device described above, having the functions described later.
  • FIG. 3 is a functional block diagram of the computer 10.
  • The computer 10 includes, as the control unit 11, a CPU (Central Processing Unit), RAM (Random Access Memory), ROM (Read Only Memory), and the like, and, as the communication unit 12, a device for communicating with other devices, for example a Wi-Fi (Wireless Fidelity) device compliant with IEEE 802.11.
  • The computer 10 also includes, as the storage unit 13, a data storage unit such as a hard disk, a semiconductor memory, a recording medium, or a memory card. Further, the computer 10 includes, as the processing unit 14, a device for executing various processes such as image processing, state diagnosis, and learning processing.
  • In the computer 10, the control unit 11 reads a predetermined program to realize the artifact image acquisition module 20 and the diagnosis result acquisition module 21 in cooperation with the communication unit 12. Similarly, the control unit 11 reads a predetermined program to realize the storage module 30 in cooperation with the storage unit 13, and reads a predetermined program to realize the analysis module 40, the learning module 41, the collation module 42, and the diagnosis module 43 in cooperation with the processing unit 14.
  • FIG. 4 is a diagram illustrating a flowchart of the learning process executed by the computer 10. Processing executed by each module described above will be described together with this processing.
  • First, the artifact image acquisition module 20 acquires a known second artifact image (step S10).
  • The second artifact image is obtained by combining a plurality of images that follow a time-series change of an artifact into one image.
  • The second artifact image acquired by the artifact image acquisition module 20 is, for example, at least one of an X-ray image, an infrared image, an ultrasonic image, or a visible light image of the artifact.
  • The artifact image acquisition module 20 may acquire the second artifact image from the corresponding imaging device, may acquire it via a computer (not shown), or may acquire it from a database or the like stored in such a computer. In the following description, the artifact image acquisition module 20 is described as having acquired pipe images as the second artifact image.
  • The analysis module 40 performs image analysis on either or both of the feature points and feature amounts of the acquired second artifact image (step S11).
  • A feature point is something that appears in the second artifact image, specifically the shape, brightness, color, contour, etc. of the artifact shown in the image.
  • A feature amount is a statistical value calculated from the second artifact image, such as the average pixel value, the variance, or a histogram.
  • The analysis module 40 extracts feature points and feature amounts by performing image analysis on the second artifact image. Specifically, the analysis module 40 analyzes the second artifact image by executing image matching techniques, blob analysis, and the like, and extracts feature amounts by executing predetermined calculations on the second artifact image.
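  • The blob analysis mentioned here could look like the following sketch (an illustrative assumption, since the publication does not name a specific library); OpenCV's SimpleBlobDetector finds connected regions whose positions and sizes can serve as feature points of the second artifact image.

```python
import cv2

def analyze_blobs(image_path: str):
    image = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if image is None:
        raise FileNotFoundError(image_path)

    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 20.0  # ignore single-pixel noise; threshold is illustrative
    detector = cv2.SimpleBlobDetector_create(params)

    # Each detected blob (e.g., a candidate crack or hot spot) yields a
    # position and size usable as a feature point of the second artifact image.
    return [(kp.pt, kp.size) for kp in detector.detect(image)]
```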
  • The diagnosis result acquisition module 21 acquires the diagnosis result of the artifact corresponding to the second artifact image acquired this time (step S12). That is, it acquires the diagnosis result associated with the acquired second artifact image.
  • The diagnosis result acquisition module 21 may acquire this diagnosis result via a computer (not shown) or the like, or may acquire it from a database or the like stored in such a computer.
  • The diagnosis result in the present embodiment is, for example, an identified symptom, a necessary treatment, or the like.
  • The learning module 41 learns by associating the second artifact image with the diagnosis result (step S13).
  • In step S13, the learning module 41 learns by associating the second artifact image acquired by the artifact image acquisition module 20 with the diagnosis result acquired by the diagnosis result acquisition module 21.
  • The learning module 41 learns at least one of the above-described X-ray image, infrared image, ultrasonic image, or visible light image of the artifact in association with the diagnosis result.
  • The learning performed by the learning module 41 is machine learning that repeatedly learns from data and finds patterns hidden in the data.
  • In step S11 or S13 described above, instead of the analysis module 40 extracting feature points, feature amounts, or both, a target location may be marked with a terminal device or the like held by a worker (not shown), and the learning module 41 may learn by associating this marked second artifact image with the diagnosis result. In this case, the marked second artifact image and the diagnosis result are learned in association with each other.
  • The analysis module 40 then performs image analysis on the marked image; that is, it extracts either or both of the feature points and feature amounts of the marked part.
  • The analysis module 40 may extract the area, shape, etc. of the marked part as a feature point or feature amount.
  • The storage module 30 stores the result of this learning as a learning result (step S14).
  • The artifact diagnosis system 1 executes the above-described learning process a sufficient number of times and stores the learning results.
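  • A minimal sketch of this learning step (steps S10 to S14) might look as follows, assuming feature vectors have already been extracted from the second artifact images and that each image has a known diagnosis label; the k-nearest-neighbour model and the file name are illustrative choices, not the publication's.

```python
import pickle
from sklearn.neighbors import KNeighborsClassifier

def learn_and_store(feature_vectors, diagnosis_labels, path="learning_result.pkl"):
    # Associate each second artifact image (via its feature vector) with its
    # diagnosis result and fit the model (learning module, step S13).
    model = KNeighborsClassifier(n_neighbors=3)
    model.fit(feature_vectors, diagnosis_labels)

    # Persist the learning result (storage module, step S14).
    with open(path, "wb") as f:
        pickle.dump(model, f)
    return model

# Example: learn_and_store(extracted_vectors, ["no defect", "crack", "crack"])
```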
  • FIG. 5 is a diagram illustrating a flowchart of the artifact diagnosis process executed by the computer 10. Processing executed by each module described above will be described together with this processing. In the following description, the artifact diagnosis system 1 will be described as diagnosing a pipe crack based on a pipe image.
  • First, the artifact image acquisition module 20 acquires a first artifact image (step S20).
  • The first artifact image is a single image formed from a plurality of images that follow a time-series change of the artifact.
  • The first artifact image acquired by the artifact image acquisition module 20 is, for example, at least one of an X-ray image, an infrared image, an ultrasonic image, or a visible light image of the artifact.
  • The artifact image acquisition module 20 may acquire the first artifact image from the corresponding imaging device, may acquire it via a computer (not shown), or may acquire it from a database or the like stored in such a computer.
  • For example, a previous image of the artifact (such as an image stored in a computer not shown) and an image newly captured this time by an imaging device or the like are collected along the time-series change and acquired together as the first artifact image.
  • In the following description, the artifact image acquisition module 20 is described as having acquired pipe images as the first artifact image.
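  • A minimal sketch of forming one first artifact image from a plurality of time-series images (for example, an older stored image and a newly captured one) is shown below; horizontal stacking and grayscale loading are assumptions made for the example.

```python
import cv2
import numpy as np

def compose_time_series(image_paths):
    frames = [cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in image_paths]
    if any(f is None for f in frames):
        raise FileNotFoundError("one of the time-series images could not be read")

    # Resize every frame to a common height, then place the earliest frame on
    # the left and the latest on the right to form a single composite image.
    height = min(f.shape[0] for f in frames)
    frames = [cv2.resize(f, (f.shape[1], height)) for f in frames]
    return np.hstack(frames)

# Example: first_artifact_image = compose_time_series(["pipe_old.png", "pipe_now.png"])
```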
  • The analysis module 40 performs image analysis on either or both of the feature points and feature amounts of the acquired first artifact image (step S21).
  • The feature points and feature amounts are as described above.
  • The analysis module 40 extracts the feature points and feature amounts of the first artifact image, as in step S11 described above.
  • The analysis module 40 performs image analysis on the marked first artifact image.
  • The marked first artifact image is an image in which a specific part of the image (a part where the hue differs, such as a part where a crack has occurred or a part where the temperature is high) is enclosed. This mark is given by a terminal device (not shown) or automatically.
  • The analysis module 40 extracts feature points and feature amounts based on differences in color (hue) within the marked area.
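  • As an illustration of analysing only the marked part, the following sketch assumes the mark is given as a rectangle (x, y, width, height) and computes hue-based feature amounts inside it; the rectangle format and the returned statistics are assumptions for the example.

```python
import cv2
import numpy as np

def marked_region_features(image_path: str, mark):
    x, y, w, h = mark                      # marked (enclosed) rectangle
    image = cv2.imread(image_path, cv2.IMREAD_COLOR)
    if image is None:
        raise FileNotFoundError(image_path)
    region = image[y:y + h, x:x + w]

    # Differences in hue inside the marked area (e.g., a discoloured or hot part).
    hue = cv2.cvtColor(region, cv2.COLOR_BGR2HSV)[:, :, 0].astype(np.float32)
    return {
        "mean_hue": float(hue.mean()),
        "hue_variance": float(hue.var()),
        "marked_area": float(w * h),
    }

# Example: marked_region_features("pipe_now.png", (120, 80, 40, 40))
```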
  • The artifact image acquisition module 20 acquires a second artifact image (step S22).
  • In step S22, the artifact image acquisition module 20 acquires, as the second artifact image, the learning result stored in the storage module 30 by the process of step S14 described above. At this time, the acquired second artifact image is marked.
  • Alternatively, the artifact image acquisition module 20 may acquire, instead of a learning result, a plurality of images that follow a past time-series change of another artifact as the second artifact image. The artifact image acquisition module 20 may also acquire a second artifact image that is not marked.
  • The analysis module 40 performs image analysis on either or both of the feature points and feature amounts of the acquired second artifact image (step S23).
  • The feature points and feature amounts of the second artifact image are extracted in the same manner as in steps S11 and S21 described above.
  • The collation module 42 collates the result of image analysis of the first artifact image with the result of image analysis of the second artifact image (step S24).
  • In step S24, the collation module 42 collates either or both of the feature points and feature amounts extracted from the first artifact image with the corresponding feature points and feature amounts extracted from the second artifact image.
  • That is, when feature points are extracted from the first artifact image, the collation module 42 collates the feature points of the first artifact image with the feature points of the second artifact image; when feature amounts are extracted, it collates the feature amounts of the first artifact image with the feature amounts of the second artifact image; and when both feature points and feature amounts are extracted, it collates both the feature points and the feature amounts of the two images.
  • A specific collation method will be described later.
  • FIG. 6 is a diagram illustrating a first artifact image and a second artifact image that are collated by the collation module 42.
  • The collation module 42 arranges the first artifact images (the first piping image 200, the second piping image 210, and the third piping image 220) in the first artifact image display area 100, arranges the second artifact images (the fourth piping image 300 and the fifth piping image 310) in the second artifact image display area 110, and collates them.
  • The first piping image 200, the second piping image 210, and the third piping image 220 show the time-series change of the artifact in this order.
  • The fourth piping image 300 and the fifth piping image 310 show the past time-series change of another artifact, different from the one in the first artifact images.
  • The fifth piping image 310 is a piping image in a state where a crack has occurred.
  • The collation module 42 collates the change from the fourth piping image 300 to the fifth piping image 310 with the change in the first artifact images, as feature points or feature amounts.
  • The collation module 42 determines, as a score, the degree of similarity between the change in the first artifact images and the change in the second artifact images. For example, the score is high when the changes have many similarities and low when they have few.
  • The collation module 42 thus collates the change in the first artifact images as feature points or feature amounts and determines whether the similarity to the cracked pipe image in the second artifact images yields a high score, thereby determining whether the artifact is cracked.
  • The numbers of first artifact images and second artifact images collated by the collation module 42 are not limited to those described above and can be changed as appropriate. The collation module 42 may also collate additional artifact images beyond the first and second artifact images.
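  • The collation of time-series changes could be scored as in the sketch below, assuming each artifact image has already been reduced to a NumPy feature vector; cosine similarity of the frame-to-frame differences is an illustrative scoring rule, not the one defined in the publication.

```python
import numpy as np

def change_vectors(feature_series):
    # Time-series change: difference between consecutive images' feature vectors.
    return [b - a for a, b in zip(feature_series, feature_series[1:])]

def collation_score(first_series, second_series) -> float:
    scores = []
    for c1, c2 in zip(change_vectors(first_series), change_vectors(second_series)):
        denom = float(np.linalg.norm(c1) * np.linalg.norm(c2))
        scores.append(float(np.dot(c1, c2)) / denom if denom else 0.0)
    # Many similar changes -> high score; few similar changes -> low score.
    return float(np.mean(scores)) if scores else 0.0

# Example: score = collation_score([v200, v210, v220], [v300, v310])
```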
  • The diagnosis module 43 diagnoses the artifact based on the collation result (step S25).
  • The diagnosis module 43 diagnoses the artifact based on the similarity calculated by the collation module 42. For example, when the similarity score resulting from the collation is equal to or higher than a predetermined value, the diagnosis module 43 gives the artifact a diagnosis similar to the diagnosis result associated with the second artifact image.
  • The diagnosis module 43 may diagnose a symptom or treatment itself as the diagnosis result, or may diagnose a risk related to a defect of the artifact.
  • The risk related to the defect is, for example, the probability (as a percentage) that the diagnosed defect has occurred, or the probability that the corresponding defect will occur in the future.
  • The diagnosis module 43 may also further collate other second artifact images to obtain another diagnosis result for this artifact.
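  • A minimal sketch of the diagnosis in step S25 follows, assuming a collation score in the range [0, 1] as produced above; the 0.8 threshold and the mapping of the score to a defect-risk percentage are illustrative assumptions.

```python
def diagnose(score: float, stored_diagnosis: str, threshold: float = 0.8):
    """Reuse the diagnosis attached to the matched second artifact image."""
    if score >= threshold:
        return {
            "diagnosis": stored_diagnosis,            # e.g. "crack developing"
            "defect_risk_percent": round(score * 100, 1),
        }
    return None  # caller may collate another second artifact image instead

# Example: diagnose(collation_score(first_series, second_series), "crack developing")
```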
  • The means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program.
  • The program is provided, for example, in the form of software as a service (SaaS) supplied from a computer via a network.
  • Alternatively, the program is provided in a form recorded on a computer-readable recording medium such as a flexible disk, a CD (CD-ROM, etc.), or a DVD (DVD-ROM, DVD-RAM, etc.).
  • In this case, the computer reads the program from the recording medium, transfers it to an internal or external storage device, stores it, and executes it.
  • The program may also be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from the storage device to the computer via a communication line.


Abstract

The objective of the invention is to provide a computer system, an artifact diagnosis method, and a program that, by combining a plurality of sets of time-series image data, improve diagnostic accuracy to a greater degree than conventional diagnosis based on single-image analysis. To this end, the invention concerns a computer system for diagnosing an artifact that acquires a plurality of first artifact images that follow a time-series change of the artifact, performs image analysis on the acquired first artifact images, acquires a plurality of second artifact images obtained in the past and following a time-series change of another artifact, performs image analysis on the acquired second artifact images, collates the image analysis result of the first artifact images with the image analysis result of the second artifact images, and then diagnoses the artifact based on the collation result.
PCT/JP2017/013821 2017-03-31 2017-03-31 Système informatique, procédé de diagnostic d'objet artificiel et programme WO2018179421A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/013821 WO2018179421A1 (fr) 2017-03-31 2017-03-31 Système informatique, procédé de diagnostic d'objet artificiel et programme

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/013821 WO2018179421A1 (fr) 2017-03-31 2017-03-31 Système informatique, procédé de diagnostic d'objet artificiel et programme

Publications (1)

Publication Number Publication Date
WO2018179421A1 true WO2018179421A1 (fr) 2018-10-04

Family

ID=63677510

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/013821 WO2018179421A1 (fr) 2017-03-31 2017-03-31 Système informatique, procédé de diagnostic d'objet artificiel et programme

Country Status (1)

Country Link
WO (1) WO2018179421A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63132381A (ja) * 1986-11-25 1988-06-04 Hitachi Ltd 画像デ−タ検索・表示システム
JP2010165127A (ja) * 2009-01-14 2010-07-29 Canon Inc 情報処理装置および情報処理方法
JP2016012346A (ja) * 2014-06-04 2016-01-21 パナソニック株式会社 制御方法及びプログラム
JP2016045662A (ja) * 2014-08-21 2016-04-04 富士フイルム株式会社 類似画像検索装置、類似画像検索装置の作動方法、および類似画像検索プログラム


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17903874; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17903874; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)