WO2018167974A1 - Image processing device, control method, and control program - Google Patents
Image processing device, control method, and control program
- Publication number
- WO2018167974A1 (PCT/JP2017/011039)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- numerical value
- meter
- digit
- partial area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/98—Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
Definitions
- The present disclosure relates to an image processing device, a control method, and a control program, and more particularly to an image processing device, a control method, and a control program for recognizing a numerical value in a meter from an image obtained by capturing the meter.
- An instrument reading device is known that captures a scene including a numerical value indicated by a digital instrument, processes the captured image, and detects the numerical value indicated by the instrument (see Patent Document 1).
- The purpose of the image processing apparatus, control method, and control program disclosed herein is to enable an appropriate image to be stored efficiently as an evidence image.
- An image processing apparatus includes a storage unit; an imaging unit that sequentially generates input images obtained by capturing a meter; a numerical value recognition unit that identifies and counts the numerical value in the meter shown in each of the sequentially generated input images and recognizes the numerical value in the meter based on the counting result; a determination unit that determines, for each partial area corresponding to each digit of the numerical value in the meter, whether the partial area is clear; and a control unit that, for each digit, selects an input image for which the partial area corresponding to that digit is determined to be clear, and stores at least a part of the selected input image in the storage unit as an evidence image in association with the numerical value recognized by the numerical value recognition unit.
- A control method is a control method for an image processing apparatus that includes a storage unit and an imaging unit that sequentially generates input images obtained by capturing a meter. The control method identifies and counts the numerical value in the meter shown in each of the sequentially generated input images, recognizes the numerical value in the meter based on the counting result, determines, for each partial area corresponding to each digit of the numerical value in the meter, whether the partial area is clear, selects, for each digit, an input image for which the partial area corresponding to that digit is determined to be clear, and stores at least a part of the selected input image in the storage unit as an evidence image in association with the recognized numerical value.
- A control program is a control program for an image processing apparatus that includes a storage unit and an imaging unit that sequentially generates input images obtained by capturing a meter. The control program causes the image processing apparatus to identify and count the numerical value in the meter shown in each of the sequentially generated input images, recognize the numerical value in the meter based on the counting result, determine, for each partial area corresponding to each digit of the numerical value in the meter, whether the partial area is clear, select, for each digit, an input image for which the partial area corresponding to that digit is determined to be clear, and store at least a part of the selected input image in the storage unit as an evidence image in association with the recognized numerical value.
- According to the image processing apparatus, the control method, and the control program, an appropriate image can be efficiently stored as an evidence image.
- FIG. 2 is a diagram illustrating a schematic configuration of a storage device 110 and a CPU 120.
- FIG. 3 is a flowchart showing an example of the operation of the overall process.
- FIG. 10 is a diagram showing a schematic configuration of a processing circuit 230 according to another embodiment.
- FIG. 1 is a diagram illustrating an example of a schematic configuration of an image processing apparatus 100 according to the embodiment.
- The image processing apparatus 100 is a portable information processing apparatus such as a tablet PC, a multi-function mobile phone (so-called smartphone), a portable information terminal, or a notebook PC, and is used by an operator, who is its user.
- the image processing apparatus 100 includes a communication device 101, an input device 102, a display device 103, an imaging device 104, a storage device 110, a CPU (Central Processing Unit) 120, and a processing circuit 130.
- The communication device 101 includes a communication interface circuit including an antenna whose sensitive band is mainly the 2.4 GHz band, the 5 GHz band, or the like.
- The communication device 101 performs wireless communication with an access point or the like based on the IEEE (The Institute of Electrical and Electronics Engineers, Inc.) 802.11 standard wireless communication scheme.
- the communication device 101 transmits / receives data to / from an external server device (not shown) via an access point.
- the communication apparatus 101 supplies the data received from the server apparatus via the access point to the CPU 120, and transmits the data supplied from the CPU 120 to the server apparatus via the access point.
- the communication device 101 may be any device that can communicate with an external device.
- the communication device 101 may communicate with a server device via a base station device (not shown) according to a mobile phone communication method, or may communicate with a server device according to a wired LAN communication method.
- the input device 102 has a touch panel type input device, an input device such as a keyboard and a mouse, and an interface circuit that acquires signals from the input device.
- the input device 102 receives a user input and outputs a signal corresponding to the user input to the CPU 120.
- the display device 103 includes a display composed of liquid crystal, organic EL (Electro-Luminescence), and the like, and an interface circuit that outputs image data or various information to the display.
- the display device 103 is connected to the CPU 120 and displays the image data output from the CPU 120 on a display. Note that the input device 102 and the display device 103 may be integrally configured using a touch panel display.
- the imaging device 104 includes a reduction optical system type imaging sensor including an imaging element made up of a CCD (Charge Coupled Device) arranged one-dimensionally or two-dimensionally, and an A / D converter.
- The imaging device 104 is an example of an imaging unit, and sequentially captures images of the meter in accordance with instructions from the CPU 120 (for example, at 30 frames/second).
- the image sensor generates an analog image signal obtained by photographing the meter and outputs the analog image signal to the A / D converter.
- the A / D converter performs analog-digital conversion on the output analog image signal to sequentially generate digital image data, and outputs the digital image data to the CPU 120.
- Note that the imaging device 104 may instead include an equal-magnification optical system type CIS (Contact Image Sensor) having an imaging element made up of a CMOS (Complementary Metal Oxide Semiconductor).
- Hereinafter, the digital image data generated by the imaging device 104 capturing the meter may be referred to as an input image.
- the storage device 110 is an example of a storage unit.
- the storage device 110 includes a memory device such as a RAM (Random Access Memory) and a ROM (Read Only Memory), a fixed disk device such as a hard disk, or a portable storage device such as a flexible disk and an optical disk. Further, the storage device 110 stores computer programs, databases, tables, and the like used for various processes of the image processing apparatus 100.
- the computer program may be installed from a computer-readable portable recording medium such as a CD-ROM (compact disk read only memory) or a DVD ROM (digital versatile disk read only memory).
- the computer program is installed in the storage device 110 using a known setup program or the like.
- the storage device 110 also stores a management table that manages information related to each input image.
- the CPU 120 operates based on a program stored in the storage device 110 in advance.
- the CPU 120 may be a general purpose processor. Instead of the CPU 120, a DSP (digital signal processor), an LSI (large scale integration), or the like may be used. Instead of the CPU 120, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or the like may be used.
- the CPU 120 is connected to the communication device 101, the input device 102, the display device 103, the imaging device 104, the storage device 110, and the processing circuit 130, and controls these units.
- the CPU 120 performs data transmission / reception control via the communication device 101, input control of the input device 102, display control of the display device 103, imaging control of the imaging device 104, control of the storage device 110, and the like. Further, the CPU 120 recognizes a numerical value in the meter reflected in the input image generated by the imaging device 104 and stores the evidence image in the storage device 110.
- the processing circuit 130 performs predetermined image processing such as correction processing on the input image acquired from the imaging device 104.
- an LSI, DSP, ASIC, FPGA, or the like may be used as the processing circuit 130.
- FIG. 2 is a diagram showing a schematic configuration of the storage device 110 and the CPU 120.
- the storage device 110 stores programs such as a numerical value recognition program 111, a determination program 112, and a control program 113.
- Each of these programs is a functional module implemented by software operating on the processor.
- the CPU 120 functions as the numerical value recognition unit 121, the determination unit 122, and the control unit 123 by reading each program stored in the storage device 110 and operating according to each read program.
- FIG. 3 is a flowchart showing an example of the operation of the entire process performed by the image processing apparatus 100.
- the operation flow described below is mainly executed by the CPU 120 in cooperation with each element of the image processing apparatus 100 based on a program stored in the storage device 110 in advance.
- First, when the user uses the input device 102 to input a shooting start instruction for instructing the start of shooting, the numerical value recognition unit 121 receives a shooting start instruction signal from the input device 102 (step S101).
- Next, the numerical value recognition unit 121 initializes the information used for image processing, sets parameters such as the shooting size and focus of the imaging device 104, and causes the imaging device 104 to capture the meter and generate input images.
- the numerical value recognition unit 121 sequentially stores input images sequentially generated by the imaging device 104 in the storage device 110.
- the numerical value recognition unit 121 executes a partial area detection process (step S102).
- the numerical value recognition unit 121 detects a partial area corresponding to each digit of the numerical value in the meter shown in the input image generated by the imaging device 104. Details of the partial area detection processing will be described later.
- the numerical value recognition unit 121 determines whether a partial area that can be used in the numerical value recognition process is detected in the partial area detection process (step S103).
- When no partial area usable in the numerical value recognition process is detected, the numerical value recognition unit 121 returns the process to step S102 and executes the partial area detection process on a newly generated input image. On the other hand, when a partial area usable in the numerical value recognition process is detected, the numerical value recognition unit 121 executes the numerical value recognition process (step S104). In the numerical value recognition process, the numerical value recognition unit 121 identifies and counts the numerical values in the meter shown in the sequentially generated input images, and recognizes the numerical value in the meter based on the counting result. Details of the numerical value recognition process will be described later.
- the numerical value recognition unit 121 determines whether or not the numerical value in the meter has been recognized in the numerical value recognition process (step S105).
- When the numerical value in the meter cannot be recognized, the numerical value recognition unit 121 returns the process to step S102 and repeats the processes of steps S102 to S105 for a newly generated input image.
- the determination unit 122 and the control unit 123 execute evidence image storage processing (step S106).
- In the evidence image storage process, the determination unit 122 determines whether each partial area is clear. Further, for each digit, the control unit 123 selects an input image for which the partial area corresponding to that digit is determined to be clear, and stores the image corresponding to the selected input image in the storage device 110 as an evidence image in association with the numerical value recognized by the numerical value recognition unit 121. Details of the evidence image storage process will be described later.
- Next, the control unit 123 displays the numerical value recognized by the numerical value recognition unit 121 and/or the selected evidence image on the display device 103 (step S107), and ends the series of steps. Further, the control unit 123 may transmit the numerical value recognized by the numerical value recognition unit 121 and/or the selected evidence image to the server device via the communication device 101.
- FIG. 4 is a flowchart showing an example of the operation of the partial area detection process. The operation flow shown in FIG. 4 is executed in step S102 of the flowchart shown in FIG.
- the numerical value recognition unit 121 detects a plate frame from the input image (step S201).
- FIG. 5A is a diagram showing an example of an input image 500 obtained by photographing a meter (device).
- a meter has a black casing 501 and a white plate 502 inside the casing 501.
- the plate 502 is visible through glass (not shown), and a meter portion 503 on which a numerical value such as the amount of electric power measured by the meter is displayed is disposed on the plate 502.
- the numerical value is shown in white and the background is shown in black.
- the numerical value recognition unit 121 detects the outer edge of the plate 502 as a plate frame.
- In the following, a meter in which the numerical value to be measured has four digits is described as an example, but the number of digits of the numerical value measured by the meter may be any number of two or more.
- For each pixel in the input image, the numerical value recognition unit 121 extracts that pixel as an edge pixel if the absolute value of the difference between its luminance value or color value (R value, G value, B value) and that of a pixel adjacent to it in the horizontal or vertical direction, or of a pixel separated from it by a predetermined distance, exceeds a first threshold.
- The numerical value recognition unit 121 then extracts straight lines passing through the vicinity of the extracted edge pixels by using the Hough transform or the least-squares method, and detects, as the plate frame, the largest rectangle among the rectangles formed by four extracted straight lines that are approximately orthogonal to each other.
- the numerical value recognition unit 121 determines whether each extracted edge pixel is connected to other edge pixels, and labels the connected edge pixels as one group.
- the numerical value recognition unit 121 may detect, as a plate frame, an outer edge of a region surrounded by the largest group among the extracted groups.
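- As an illustration of the edge-pixel-based plate frame detection described above, the following is a minimal Python sketch (not part of this disclosure): it thresholds luminance differences to obtain edge pixels, as described, but uses OpenCV contour grouping and quadrilateral approximation in place of the Hough-based line fitting; the function name, the threshold value, and the OpenCV dependency are illustrative assumptions.

```python
import cv2
import numpy as np

def detect_plate_frame(input_image_bgr, first_threshold=40):
    """Sketch: extract edge pixels by thresholding luminance differences,
    then take the largest quadrilateral contour as the plate frame."""
    gray = cv2.cvtColor(input_image_bgr, cv2.COLOR_BGR2GRAY).astype(np.int16)

    # Absolute luminance difference between horizontally / vertically adjacent pixels.
    diff_h = np.abs(np.diff(gray, axis=1))
    diff_v = np.abs(np.diff(gray, axis=0))
    edges = np.zeros(gray.shape, dtype=np.uint8)
    edges[:, :-1] |= (diff_h > first_threshold).astype(np.uint8)
    edges[:-1, :] |= (diff_v > first_threshold).astype(np.uint8)
    edges *= 255

    # Group connected edge pixels, approximate each group by a polygon,
    # and keep the largest roughly rectangular one as the plate frame.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    best, best_area = None, 0.0
    for c in contours:
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        area = cv2.contourArea(approx)
        if len(approx) == 4 and area > best_area:
            best, best_area = approx.reshape(4, 2), area
    return best  # 4 corner points of the plate frame, or None
```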
- the numerical value recognition unit 121 may detect the plate frame using the difference between the color of the housing 501 and the color of the plate 502.
- The numerical value recognition unit 121 extracts a pixel as a left-end edge pixel if its luminance value or color value is less than a second threshold (indicating black) and the luminance value or color value of the pixel adjacent to it on the right side, or of a pixel separated from it by a predetermined distance on the right side, is greater than or equal to the second threshold (indicating white).
- the second threshold value is set to an intermediate value between the value indicating black and the value indicating white.
- Similarly, the numerical value recognition unit 121 extracts a pixel as a right-end edge pixel if its luminance value or color value is less than the second threshold and the luminance value or color value of the pixel adjacent to it on the left side, or of a pixel separated from it by a predetermined distance on the left side, is greater than or equal to the second threshold. Likewise, the numerical value recognition unit 121 extracts a pixel as an upper-end edge pixel if its luminance value or color value is less than the second threshold and the luminance value or color value of the pixel adjacent to it on the lower side, or of a pixel separated from it by a predetermined distance on the lower side, is greater than or equal to the second threshold.
- The numerical value recognition unit 121 extracts a pixel as a lower-end edge pixel if its luminance value or color value is less than the second threshold and the luminance value or color value of the pixel adjacent to it on the upper side, or of a pixel separated from it by a predetermined distance on the upper side, is greater than or equal to the second threshold.
- The numerical value recognition unit 121 then extracts straight lines connecting the extracted left-end, right-end, upper-end, and lower-end edge pixels by using the Hough transform or the least-squares method, and detects the rectangle formed by the extracted straight lines as the plate frame.
- the numerical value recognition unit 121 detects a numerical background frame from the detected area within the plate frame (step S202).
- the numerical value recognition unit 121 detects the outer edge of the meter portion 503 as a numerical value background frame.
- The numerical value recognition unit 121 detects the numerical value background frame by using a discriminator that has been pre-learned so as to output position information of the outer edge of the meter portion 503 when an image showing the plate 502 including the meter portion 503 is input. This discriminator is pre-learned, for example by deep learning, using a plurality of images obtained by photographing the meter, and is stored in the storage device 110 in advance. The numerical value recognition unit 121 detects the numerical value background frame by inputting an image including the detected plate frame to the discriminator and acquiring the position information output from the discriminator.
- the numerical value recognition unit 121 may detect the numerical value background frame based on the edge pixels in the input image, as in the case of detecting the plate frame.
- In that case, the numerical value recognition unit 121 extracts edge pixels from the region of the input image that includes the plate frame, extracts straight lines passing through the vicinity of the extracted edge pixels, and detects, as the numerical value background frame, the largest rectangle among the rectangles formed by four extracted straight lines that are approximately orthogonal to each other.
- Alternatively, the numerical value recognition unit 121 may detect, as the numerical value background frame, the outer edge of the region surrounded by the largest group of connected edge pixels.
- Alternatively, the numerical value recognition unit 121 may detect the numerical value background frame using the difference between the color of the plate 502 and the color of the meter portion 503, in the same manner as when detecting the plate frame.
- In that case, the numerical value recognition unit 121 extracts a pixel as a left-end edge pixel if its luminance value or color value is greater than or equal to the second threshold (indicating white) and the luminance value or color value of the pixel adjacent to it on the right side, or of a pixel separated from it by a predetermined distance on the right side, is less than the second threshold (indicating black).
- Similarly, the numerical value recognition unit 121 extracts right-end edge pixels, upper-end edge pixels, and lower-end edge pixels.
- The numerical value recognition unit 121 extracts straight lines passing through the vicinity of the extracted left-end, right-end, upper-end, and lower-end edge pixels by using the Hough transform or the least-squares method, and detects the rectangle formed by the extracted straight lines as the numerical value background frame.
- Alternatively, the numerical value recognition unit 121 may detect the marks 504 and detect the numerical value background frame within the region sandwiched between the marks 504 in the horizontal and vertical directions.
- the numerical value recognition unit 121 detects a partial region corresponding to each digit of the numerical value in the meter from the detected region in the numerical value background frame (step S203).
- FIG. 5B is a diagram for explaining a partial region.
- An image 510 shown in FIG. 5B shows a numerical background frame of the meter portion 503 detected from the input image 500.
- The numerical value recognition unit 121 detects, as partial areas, the rectangular areas 511 to 514 each of which includes the value of one digit of the numerical value within the numerical value background frame.
- The numerical value recognition unit 121 detects the partial areas by using a discriminator that has been pre-learned so as to output position information of each rectangular area including each digit value of the numerical value in the meter portion 503 when an image showing the meter portion 503 is input.
- This discriminator is pre-learned using a plurality of images obtained by photographing the meter, for example, by deep learning, and stored in the storage device 110 in advance.
- the numerical value recognition unit 121 detects a partial region by inputting an image including the detected numerical value background frame to a classifier and acquiring position information output from the classifier.
- Alternatively, the numerical value recognition unit 121 may detect the partial areas based on the edge pixels in the input image, as in the case of detecting the plate frame.
- In that case, the numerical value recognition unit 121 extracts edge pixels from the region of the input image that includes the numerical value background frame, extracts straight lines passing through the vicinity of the extracted edge pixels, and detects the largest rectangular areas formed by four extracted straight lines that are approximately orthogonal to each other.
- the numerical value recognition unit 121 detects a region surrounded by a group having the largest area among the groups in which the extracted edge pixels are connected to each other.
- Alternatively, the numerical value recognition unit 121 may detect a single-digit number in each detected region using a known OCR (Optical Character Recognition) technique, and detect each region in which a single-digit number is detected as a partial area.
- An image 520 shown in FIG. 5B shows an image in which the entire meter portion 503 is unclear and the difference in luminance value or color value between the numerical value portion and the background portion is small. In the image 520, the rectangular areas 521 to 524 are not detected as partial areas.
- An image 530 shown in FIG. 5B shows an image in which ambient light such as illumination is reflected on the glass portion covering the front surface of the meter, so that disturbance light is reflected on the entire meter portion 503 and all numerical values cannot be identified. In the image 530, the rectangular areas 531 to 534 are not detected as partial areas.
- An image 540 shown in FIG. 5B shows an image in which disturbance light is reflected on a part of the meter portion 503 and the numerical value on which the disturbance light is reflected cannot be identified.
- the rectangular areas 541, 543, and 544 are detected as partial areas, but the rectangular area 542 is not detected as a partial area.
- An image 550 shown in FIG. 5B shows an image in which a shadow is reflected in a part of the meter portion 503 and a numerical value where the shadow is reflected is erroneously recognized.
- the rectangular areas 551 to 554 are detected as partial areas.
- the numerical value recognition unit 121 assigns a digit number to each detected partial area (step S204).
- The numerical value recognition unit 121 divides the region within the numerical value background frame equally in the horizontal direction by the number of digits of the numerical value indicated by the meter, and assigns digit numbers to the divided regions in ascending order from the right (digit numbers 1, 2, 3, and 4 are assigned in order from the rightmost region).
- the numerical value recognition unit 121 assigns a digit number assigned to an area including the center position of each partial area to each detected partial area.
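- A minimal sketch of this digit-number assignment, assuming the numerical value background frame and the detected partial areas are given as axis-aligned rectangles; the helper name and the four-digit default are illustrative assumptions, not part of this disclosure.

```python
def assign_digit_numbers(background_frame, partial_areas, num_digits=4):
    """background_frame: (x, y, width, height) of the numerical value background frame.
    partial_areas: list of (x, y, width, height) rectangles, one per detected digit.
    Returns a dict mapping each partial area index to its digit number
    (1 = rightmost digit, increasing to the left)."""
    bx, by, bw, bh = background_frame
    bin_width = bw / num_digits  # equal horizontal division by the number of digits

    assignments = {}
    for idx, (x, y, w, h) in enumerate(partial_areas):
        center_x = x + w / 2.0
        # Position of the center measured from the RIGHT edge of the frame.
        offset_from_right = (bx + bw) - center_x
        digit_number = int(offset_from_right // bin_width) + 1
        # Clamp to the valid range in case the center lies slightly outside the frame.
        assignments[idx] = min(max(digit_number, 1), num_digits)
    return assignments

# Example: a 400-pixel-wide frame divided into four 100-pixel bins.
areas = [(10, 5, 80, 60), (110, 5, 80, 60), (210, 5, 80, 60), (310, 5, 80, 60)]
print(assign_digit_numbers((0, 0, 400, 70), areas))  # {0: 4, 1: 3, 2: 2, 3: 1}
```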
- the numerical value recognition unit 121 determines whether or not each detected partial area can be used in the numerical value recognition process (step S205).
- the numerical value recognition unit 121 determines whether or not each partial area can be used in the numerical value recognition process based on whether or not each partial area includes blur or shine.
- Blur means a region where the difference in luminance value between pixels in the image is small due to defocusing of the imaging device 104, or a region where the same object is captured over a plurality of pixels due to camera shake by the user and the difference in luminance value between pixels is therefore small.
- The term “shine” means a region where the luminance values of pixels in a predetermined region of the image are saturated (blown out to white) due to the influence of disturbance light or the like.
- The numerical value recognition unit 121 determines whether each partial area includes blur by using a discriminator that has been pre-learned so as to output a blur degree indicating the degree of blur included in an image when the image is input.
- This discriminator is pre-learned using an image obtained by photographing a meter and not including blur by, for example, deep learning, and is stored in the storage device 110 in advance. Note that this discriminator may be pre-learned using an image obtained by photographing a meter and including blur.
- The numerical value recognition unit 121 inputs an image including the detected partial area to the discriminator, and determines whether the partial area includes blur depending on whether the blur degree output from the discriminator is equal to or greater than a third threshold.
- the numerical value recognition unit 121 may determine whether or not each partial area includes blur based on the edge strength of the luminance value of each pixel included in the partial area.
- In that case, the numerical value recognition unit 121 calculates, as the edge strength of each pixel, the absolute value of the difference between the luminance value of that pixel and that of a pixel adjacent to it in the horizontal or vertical direction, or of a pixel separated from it by a predetermined distance.
- the numerical value recognition unit 121 determines whether or not the partial area includes blur depending on whether or not the average value of the edge intensities calculated for each pixel in the partial area is equal to or less than the fourth threshold value.
- the numerical value recognition unit 121 may determine whether or not each partial area includes blur based on the distribution of luminance values of the pixels included in the partial area.
- the numerical value recognition unit 121 generates a histogram of the luminance value of each pixel in the partial region, and detects a local maximum value in each of the luminance value range indicating the numerical value (white) and the luminance value range indicating the background (black). Then, the average value of the full width at half maximum of each maximum value is calculated.
- the numerical value recognition unit 121 determines whether or not the partial area includes blur depending on whether or not the average value of the calculated half-value widths of the local maximum values is equal to or greater than the fifth threshold value.
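- The edge-strength criterion for blur described above can be sketched as follows; the grayscale input and the fourth-threshold value are illustrative assumptions.

```python
import numpy as np

def contains_blur(partial_area_gray, fourth_threshold=15.0):
    """partial_area_gray: 2-D numpy array of luminance values for one partial area.
    Returns True if the area is judged to contain blur, i.e. if the average
    edge strength (absolute luminance difference between horizontally or
    vertically adjacent pixels) is at or below the fourth threshold."""
    g = partial_area_gray.astype(np.float32)
    edge_h = np.abs(np.diff(g, axis=1))   # horizontal neighbor differences
    edge_v = np.abs(np.diff(g, axis=0))   # vertical neighbor differences
    mean_edge_strength = (edge_h.sum() + edge_v.sum()) / (edge_h.size + edge_v.size)
    return mean_edge_strength <= fourth_threshold
```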
- The numerical value recognition unit 121 determines whether each partial area includes shine by using a discriminator that has been pre-learned so as to output a shine degree indicating the degree of shine included in an image when the image is input.
- This discriminator is pre-learned using, for example, an image obtained by photographing a meter and not including shine by deep learning or the like, and is stored in the storage device 110 in advance. Note that this discriminator may be pre-learned using an image obtained by photographing a meter and including shine.
- The numerical value recognition unit 121 inputs an image including the detected partial area to the discriminator, and determines whether the partial area includes shine depending on whether the shine degree output from the discriminator is equal to or greater than a sixth threshold.
- the numerical value recognition unit 121 may determine whether or not each partial area includes shine based on the luminance value of each pixel included in the partial area.
- In that case, the numerical value recognition unit 121 counts the number of pixels in the partial area whose luminance value is greater than or equal to a seventh threshold (indicating white), and determines whether the partial area includes shine depending on whether the counted number is equal to or greater than an eighth threshold.
- the numerical value recognition unit 121 may determine whether or not each partial area includes a shine based on the distribution of luminance values of each pixel included in the partial area.
- In that case, the numerical value recognition unit 121 generates a histogram of the luminance values of the pixels in the partial area, and determines whether the partial area includes shine depending on whether the number of pixels distributed in the range at or above the seventh threshold is equal to or greater than the eighth threshold.
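- A sketch of the saturated-pixel criterion for shine; the seventh threshold is a luminance level and the eighth threshold a pixel count, and both values here are illustrative assumptions.

```python
import numpy as np

def contains_shine(partial_area_gray, seventh_threshold=250, eighth_threshold=200):
    """Returns True if the partial area is judged to contain shine, i.e. if the
    number of near-saturated (white) pixels reaches the eighth threshold."""
    saturated_pixels = int(np.count_nonzero(partial_area_gray >= seventh_threshold))
    return saturated_pixels >= eighth_threshold
```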
- Each threshold value and each range described above is set in advance through prior experiments.
- the numerical value recognition unit 121 stores the information of the partial area determined to be usable in the management table of the storage device 110 (step S206), and ends a series of steps.
- FIG. 6 is a diagram showing an example of the data structure of the management table.
- the input image ID is uniquely assigned for each input image.
- the evidence image information is information indicating the storage destination of the evidence image.
- The evidence image is an image stored as evidence of the recognized numerical value.
- As the evidence image, for example, an image obtained by cutting out the area within the numerical value background frame from the input image is used.
- the evidence image is an example of at least a part of the input image.
- the partial area information is information indicating the storage destination of the partial area image, and is stored for each partial area corresponding to each digit.
- the partial area image is an image obtained by cutting out a partial area from the input image.
- the digit value is a digit value specified in each partial area, and is stored for each digit.
- The numerical value recognition unit 121 cuts out each partial area determined to be usable from the input image to generate a partial area image, stores the partial area image in the storage device 110, and stores the storage destination in the management table as partial area information. Note that, instead of cutting out each partial area determined to be usable from the input image, the numerical value recognition unit 121 may store position information of each partial area within the input image as partial area information in the management table.
- When the input image contains a partial area determined to be usable, the numerical value recognition unit 121 generates an evidence image by cutting out the area within the numerical value background frame from the input image, stores it in the storage device 110, and stores its storage destination in the management table as evidence image information. Note that an image obtained by cutting out the area within the plate frame from the input image, or the input image itself, may be used as the evidence image. In that case, the numerical value recognition unit 121 stores, as the evidence image, the image obtained by cutting out the area within the plate frame from the input image or the input image itself in the storage device 110, and stores its storage destination in the management table as evidence image information.
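- One possible in-memory representation of such a management table, sketched with Python dataclasses; the field names mirror the columns described above, and the file paths are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class PartialAreaRecord:
    partial_area_info: str             # storage destination (or position) of the partial area image
    digit_value: Optional[int] = None  # digit value identified in this partial area

@dataclass
class ManagementRecord:
    input_image_id: int            # uniquely assigned per input image
    evidence_image_info: str       # storage destination of the evidence image
    partial_areas: Dict[int, PartialAreaRecord] = field(default_factory=dict)  # keyed by digit number

# Example: input image 7 with usable partial areas for digits 1 and 3.
record = ManagementRecord(
    input_image_id=7,
    evidence_image_info="evidence/0007.png",
    partial_areas={
        1: PartialAreaRecord("partial/0007_d1.png", digit_value=4),
        3: PartialAreaRecord("partial/0007_d3.png", digit_value=0),
    },
)
print(record.partial_areas[1].digit_value)  # 4
```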
- FIG. 7 is a flowchart showing an example of the operation of numerical value recognition processing.
- the operation flow shown in FIG. 7 is executed in step S104 of the flowchart shown in FIG.
- Each process of steps S301 to S309 in FIG. 7 is executed for each partial area determined to be usable in the numerical value recognition process. That is, each process of steps S301 to S309 in FIG. 7 is executed for each digit of the numerical value in the meter that is shown in the input image sequentially generated by the imaging device 104.
- the numerical value recognition unit 121 determines whether or not the digit value of the digit number assigned to the partial area to be processed has been confirmed (step S301).
- When the digit value of the digit number has already been confirmed, the numerical value recognition unit 121 moves the process to step S310. On the other hand, when the digit value of the digit number has not been confirmed, the numerical value recognition unit 121 identifies the digit value shown in the partial area (step S302).
- The numerical value recognition unit 121 identifies the digit value shown in the partial area by using a discriminator that has been pre-learned so as to output the numerical value shown in an image when an image including a single-digit numerical value is input. This discriminator is pre-learned, for example by deep learning, using a plurality of images obtained by photographing each numerical value in the meter, and is stored in the storage device 110 in advance.
- the numerical value recognition unit 121 inputs an image including the partial area to the discriminator, and specifies the numerical value output from the discriminator as a digit value reflected in the partial area. Note that the numerical value recognition unit 121 may specify a digit value shown in the partial area using a known OCR technique.
- the numerical value recognition unit 121 converts the specified numerical value into an integer (step S303).
- a meter displays a measured value on a plate by rotating a cylindrical drum having a plurality of numbers printed on its side surface. Therefore, if the meter is photographed while the printed numbers are moving as the drum rotates, the partial area may include a plurality of numerical values. Therefore, the discriminator is pre-learned so as to output a decimal number according to the area in which each numerical value is shown when the partial area includes a plurality of numerical values.
- The numerical value recognition unit 121 rounds the decimal part of the numerical value output from the discriminator up, down, or to the nearest integer, thereby converting the numerical value into an integer.
- the numerical value recognition unit 121 stores the specified numerical value in the management table of the storage device 110 as the digit value of the digit number assigned to the partial area (step S304).
- the numerical value recognition unit 121 determines whether or not numerical value recognition processing has been performed on a predetermined number (for example, 10) or more input images (step S305).
- When the numerical value recognition process has not yet been performed on the predetermined number or more of input images, the numerical value recognition unit 121 moves the process to step S310.
- On the other hand, when the numerical value recognition process has been performed on the predetermined number or more of input images, the numerical value recognition unit 121 specifies the mode value among the digit values specified for the most recent predetermined number of input images (step S306).
- The numerical value recognition unit 121 refers to the management table and identifies, as the mode value, the most frequently stored digit value among the most recent predetermined number of digit values stored for the digit number assigned to the partial area to be processed.
- the numerical value recognition unit 121 calculates the ratio of the number of occurrences (in the most recent predetermined number) of the specified mode value to the predetermined number (step S307).
- the numerical value recognition unit 121 determines whether or not the calculated ratio exceeds the ninth threshold value (step S308).
- the ninth threshold is set to 50%.
- When the calculated ratio does not exceed the ninth threshold, the numerical value recognition unit 121 considers that the mode value is not yet reliable, and moves the process to step S310. On the other hand, when the calculated ratio exceeds the ninth threshold, the numerical value recognition unit 121 sets the specified mode value as the confirmed digit value (step S309). In this way, the numerical value recognition unit 121 confirms a digit value only when the calculated ratio exceeds the ninth threshold, so that the reliability of the recognized numerical value can be further increased.
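- The per-digit tallying of steps S305 to S309 can be sketched as follows; the predetermined number of 10 and the ninth threshold of 50% follow the example values given in this description, while the class and variable names are illustrative assumptions.

```python
from collections import Counter, deque

class DigitTally:
    """Keeps the most recent digit values identified for one digit number and
    confirms the digit once its mode value is frequent enough."""

    def __init__(self, predetermined_number=10, ninth_threshold=0.5):
        self.recent_values = deque(maxlen=predetermined_number)
        self.predetermined_number = predetermined_number
        self.ninth_threshold = ninth_threshold
        self.confirmed_value = None

    def add(self, digit_value):
        if self.confirmed_value is not None:
            return self.confirmed_value            # step S301: already confirmed
        self.recent_values.append(digit_value)     # step S304: store the identified value
        if len(self.recent_values) < self.predetermined_number:
            return None                            # step S305: not enough input images yet
        mode_value, occurrences = Counter(self.recent_values).most_common(1)[0]  # step S306
        ratio = occurrences / self.predetermined_number                          # step S307
        if ratio > self.ninth_threshold:                                          # step S308
            self.confirmed_value = mode_value                                     # step S309
        return self.confirmed_value

# Example: one digit observed over successive input images.
tally = DigitTally()
for value in [3, 3, 8, 3, 3, 3, 3, 3, 3, 3]:
    result = tally.add(value)
print(result)  # 3
```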
- the numerical value recognition unit 121 determines whether or not the processing has been completed for all the detected partial areas (step S310).
- When the processing has been completed for all the detected partial areas, the numerical value recognition unit 121 determines whether the digit values of all the digit numbers have been confirmed (step S311).
- When the digit values of all the digit numbers have not been confirmed, the numerical value recognition unit 121 ends the series of steps without executing any particular process.
- On the other hand, when the digit values of all the digit numbers have been confirmed, the numerical value recognition unit 121 recognizes the numerical value obtained by combining the confirmed digit values of all the digits as the numerical value in the meter (step S312), and ends the series of steps.
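- Combining the confirmed digit values into the recognized numerical value is straightforward when digit number 1 denotes the rightmost (ones) digit; a short sketch with an illustrative example follows.

```python
def combine_digits(confirmed, num_digits=4):
    """confirmed: dict mapping digit number (1 = rightmost) to its confirmed value."""
    return sum(confirmed[d] * 10 ** (d - 1) for d in range(1, num_digits + 1))

print(combine_digits({1: 7, 2: 0, 3: 4, 4: 1}))  # 1407
```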
- the numerical value recognition unit 121 identifies and counts the numerical values in the meter shown in the sequentially generated input images for each digit, and recognizes the numerical values in the meter based on the counting results.
- The numerical value recognition unit 121 specifies the values of the other digits even for an input image in which the value of a particular digit cannot be specified, and uses them in the counting. Therefore, the numerical value in the meter can be accurately recognized using fewer input images. Since the user does not need to keep capturing the meter until an input image from which all the digits can be identified is generated, the image processing apparatus 100 can improve user convenience.
- The numerical value recognition unit 121 may identify and count all the numerical values in the meter shown in each of the sequentially generated input images, and recognize the numerical value in the meter based on the counting result.
- In step S305, even when the numerical value recognition process has not yet been performed on the predetermined number or more of input images, the numerical value recognition unit 121 may execute the processes in and after step S306 if a digit value can already be confirmed.
- For example, suppose the predetermined number is 10 and the ninth threshold is 50%. If the digit values specified for the input images are all the same at the point where the numerical value recognition process has been performed on six input images, that digit value is the mode value, and the ratio of its number of occurrences to the predetermined number is already 60% or more, exceeding the ninth threshold.
- In such a case, the numerical value recognition unit 121 may confirm the recognized value even though the number of input images on which the numerical value recognition process has been executed is less than the predetermined number. Thereby, the numerical value recognition unit 121 can shorten the recognition time of the numerical value recognition process.
- FIG. 8 is a flowchart showing an example of operation of evidence image storage processing. The operation flow shown in FIG. 8 is executed in step S106 of the flowchart shown in FIG.
- The determination unit 122 determines whether each partial area is clear for every partial area detected by the numerical value recognition unit 121 (step S401). Note that the determination unit 122 may determine whether each partial area is clear only for partial areas in which the digit value specified by the numerical value recognition unit 121 matches the corresponding digit value of the numerical value recognized by the numerical value recognition unit 121.
- the clear partial area means that the numerical value included in the partial area can be recognized, and that the partial area does not include blur or shine.
- that the partial area is unclear means that the numerical value included in the partial area cannot be recognized, and means that the partial area includes blur or shine.
- the determination unit 122 determines whether each partial area is clear depending on whether each partial area includes blur or shine.
- The determination unit 122 determines whether each partial area includes blur or shine by using a discriminator that has been pre-learned using at least images obtained by photographing a meter and not including blur or shine. Alternatively, the determination unit 122 determines whether each partial area includes blur or shine based on the luminance values of the image in each partial area, in the same manner as in the process of step S205. However, the determination unit 122 sets the criteria so that each partial area is more easily determined to include blur or shine than in the process of step S205.
- For example, the determination unit 122 makes the third, fifth, sixth, or eighth threshold smaller than the value used in the process of step S205, and makes the fourth threshold larger than the value used in the process of step S205.
- Note that the determination unit 122 may determine whether each partial area includes blur or shine according to the same criteria as in the process of step S205. In that case, the determination unit 122 may use the determination result of the process of step S205. Thereby, the determination unit 122 can shorten the processing time of the determination process and reduce the processing load.
- control unit 123 determines whether or not there is one input image for which it is determined that the partial areas corresponding to all the digits are clear (step S402).
- When there is one such input image, the control unit 123 selects that input image.
- The control unit 123 stores only the evidence image corresponding to the selected input image in the storage device 110 as the evidence image, in association with the numerical value recognized by the numerical value recognition unit 121 (step S403), and ends the series of steps.
- On the other hand, when there is no input image in which the partial areas corresponding to all the digits are determined to be clear, the control unit 123 identifies the digits whose partial areas are not determined to be clear in any input image whose evidence image is used as an evidence image (step S404). When the process of step S404 is executed for the first time, all the digits are identified as digits whose partial areas are not determined to be clear.
- Next, the control unit 123 selects, as an input image whose evidence image is to be used as an evidence image, an input image that includes the largest number of partial areas determined to be clear for the identified digits (step S405). In this way, for each digit, the control unit 123 selects an input image for which the partial area corresponding to that digit is determined to be clear.
- control unit 123 determines whether or not there remains a digit for which the partial area is not determined to be clear in any of the selected input images (step S406).
- If digits whose partial areas are not determined to be clear still remain, the control unit 123 returns the process to step S404 and identifies those digits again. On the other hand, if no such digits remain, the control unit 123 stores the evidence images corresponding to the plurality of input images selected in step S405 in the storage device 110 as evidence images (step S407), and ends the series of steps.
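- The selection in steps S402 to S407 amounts to a greedy cover of the digits by clear partial areas. A minimal sketch follows, assuming per-image sets of digits whose partial areas were determined to be clear; all names are illustrative assumptions.

```python
def select_evidence_images(clear_digits_per_image, all_digits):
    """clear_digits_per_image: dict mapping input image ID to the set of digit
    numbers whose partial areas were determined to be clear in that image.
    all_digits: set of every digit number of the numerical value in the meter.
    Returns the IDs of the input images whose evidence images are stored."""
    # Step S402/S403: a single image covering every digit is preferred.
    for image_id, clear_digits in clear_digits_per_image.items():
        if all_digits <= clear_digits:
            return [image_id]

    selected, remaining = [], set(all_digits)
    while remaining:
        # Step S404/S405: pick the image covering the most still-unclear digits.
        image_id, clear_digits = max(
            clear_digits_per_image.items(),
            key=lambda item: len(item[1] & remaining),
        )
        covered = clear_digits & remaining
        if not covered:
            break  # no image covers the remaining digits; stop with a partial cover
        selected.append(image_id)
        remaining -= covered  # step S406: digits still not determined to be clear
    return selected           # step S407: store these images' evidence images

# Example: image 1 covers digits 1 to 3 and image 2 covers digit 4.
print(select_evidence_images({1: {1, 2, 3}, 2: {3, 4}}, {1, 2, 3, 4}))  # [1, 2]
```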
- FIG. 9 is a diagram for explaining a combination of evidence images stored in the storage device 110.
- In combination 1, the evidence image of the input image 1 is used as the evidence image.
- Combination 2 shows a combination for the case where there is no input image in which the partial areas corresponding to all the digits (digits 1 to 4) are determined to be clear, and only the partial areas corresponding to three digits (digits 1 to 3) are determined to be clear in the input image 1.
- In that case, the evidence image of the input image 1 and the evidence image of the input image 2, in which the partial area corresponding to the digit 4 that is not determined to be clear in the input image 1 is determined to be clear, are used as evidence images.
- Combination 4 shows a combination for the case where there is no input image in which the partial areas corresponding to the digits 3 and 4, which are not determined to be clear in the input image 1, are both determined to be clear.
- the evidence image of the input image 3 is used as the evidence image.
- Combination 5 shows a combination for the case where there is no input image in which the partial areas corresponding to two or more digits are determined to be clear.
- In that case, the evidence images of the input images 1 to 4, which are determined to have clear partial areas corresponding to the digits 1 to 4 respectively, are used as evidence images.
- In this way, for every digit, the control unit 123 uses, as an evidence image, an evidence image corresponding to an input image in which the partial area corresponding to that digit is determined to be clear.
- In particular, the control unit 123 uses, as an evidence image, the evidence image of another input image that complements the partial areas not determined to be clear in an input image whose evidence image is already used as an evidence image.
- Even when there is no input image in which all the partial areas are determined to be clear, the control unit 123 stores a plurality of images having evidentiary value, so that appropriate images can be stored efficiently as evidence images.
- The user does not need to keep photographing the meter until an input image in which all the partial areas are determined to be clear is generated, so the image processing apparatus 100 can improve user convenience.
- the image processing apparatus 100 does not need to store images related to all input images used in the numerical value recognition process, and can reduce the storage capacity of the storage device 110.
- Note that the control unit 123 may cause the imaging device 104 to continue imaging the meter and generating input images until evidence images in which the partial areas corresponding to all the digits are clear have been obtained.
- In that case, the control unit 123 continues the determination of whether each partial area is clear for each newly generated input image.
- When evidence images in which the partial areas corresponding to all the digits are clear have been obtained, the control unit 123 causes the imaging device 104 to stop imaging the meter and ends the evidence image storage process. As a result, the image processing apparatus 100 can more reliably acquire an evidence image that can be visually checked.
- Note that, when the numerical value recognition unit 121 has recognized the numerical value in the meter but evidence images in which the partial areas corresponding to all the digits are clear have not been obtained within a predetermined time, the control unit 123 may accept from the user a shooting stop instruction for instructing that shooting of the meter be stopped.
- the control unit 123 displays display data such as a button for accepting a photographing stop instruction from the user on the display device 103.
- the control unit 123 receives the shooting stop instruction.
- In that case, the control unit 123 selects, for each digit, the partial area determined to be the clearest from among the partial areas corresponding to that digit, and uses the evidence image of each input image that includes a selected partial area as an evidence image.
- For example, the control unit 123 selects, as the clearest partial area, the partial area with the smallest blur degree output from the discriminator, the partial area with the largest average value of edge strength, or the partial area with the smallest average value of the full width at half maximum.
- Alternatively, the control unit 123 selects, as the clearest partial area, the partial area with the smallest shine degree output from the discriminator, the partial area with the smallest number of pixels whose luminance value is equal to or greater than the seventh threshold, or the partial area with the smallest number of pixels distributed in the histogram range at or above the seventh threshold.
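- A sketch of this per-digit fallback selection, assuming each candidate partial area carries a pre-computed blur degree (smaller means clearer); the tuple layout is an illustrative assumption.

```python
def select_clearest_per_digit(candidates):
    """candidates: list of (digit_number, input_image_id, blur_degree) tuples,
    one per usable partial area. Returns, for each digit, the input image whose
    partial area has the smallest blur degree, i.e. is judged the clearest."""
    clearest = {}
    for digit_number, image_id, blur_degree in candidates:
        current = clearest.get(digit_number)
        if current is None or blur_degree < current[1]:
            clearest[digit_number] = (image_id, blur_degree)
    return {digit: image_id for digit, (image_id, _) in clearest.items()}

# Example: digit 1 is clearest in image 3, digit 2 in image 5.
print(select_clearest_per_digit([(1, 3, 0.1), (1, 5, 0.4), (2, 3, 0.7), (2, 5, 0.2)]))
# {1: 3, 2: 5}
```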
- the image processing apparatus 100 uses, for each digit of the numerical value in the meter, an evidence image corresponding to the input image determined to have a clear partial area corresponding to each digit as an evidence image.
- the image processing apparatus 100 can efficiently store an appropriate image as an evidence image.
- FIG. 10 is a block diagram showing a schematic configuration of the processing circuit 230 in the image processing apparatus according to another embodiment.
- the processing circuit 230 is used instead of the processing circuit 130 of the image processing apparatus 100, and executes the entire processing instead of the CPU 120.
- the processing circuit 230 includes a numerical value recognition circuit 231, a determination circuit 232, a control circuit 233, and the like.
- the numerical value recognition circuit 231 is an example of a numerical value recognition unit, and has the same function as the numerical value recognition unit 121.
- the numerical value recognition circuit 231 sequentially acquires input images obtained by photographing the meter from the imaging device 104 and sequentially stores them in the storage device 110.
- the numerical value recognition circuit 231 identifies and counts the numerical values in the meter shown in each input image, recognizes the numerical values in the meter based on the totaled results, and stores the recognition results in the storage device 110.
- the determination circuit 232 is an example of a determination unit and has the same function as the determination unit 122.
- the determination circuit 232 determines whether or not each partial area is clear for each partial area corresponding to each digit of the numerical value in the meter, and outputs the determination result to the control circuit 233.
- The control circuit 233 is an example of a control unit and has the same function as the control unit 123. For each digit, the control circuit 233 selects an input image for which the partial area corresponding to that digit is determined to be clear, and stores the evidence image corresponding to the selected input image in the storage device 110 as an evidence image in association with the numerical value recognized by the numerical value recognition circuit 231.
- the image processing apparatus 100 can efficiently store an appropriate image as an evidence image even when the processing circuit 230 is used.
- each discriminator used in the partial area detection process, the numerical value recognition process, or the evidence image storage process may be stored in an external device such as a server device instead of being stored in the storage device 110.
- the numerical value recognition unit 121 transmits each image to the server apparatus via the communication apparatus 101, and receives and acquires the identification result output from each classifier from the server apparatus.
- the image processing apparatus 100 is not limited to a portable information processing apparatus, and may be, for example, a fixed point camera or the like installed so that a meter can be imaged.
Landscapes
- Engineering & Computer Science (AREA)
- Quality & Reliability (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Character Discrimination (AREA)
Abstract
The present invention relates to an image processing device, a control method, and a control program that make it possible to store an appropriate image efficiently as an evidence image. An image processing device comprises: a storage unit; an imaging unit for sequentially generating input images by capturing images of a meter; a numerical value recognition unit for identifying and counting the numerical values in the meter shown in the sequentially generated input images and recognizing the numerical value in the meter on the basis of the counting result; a determination unit for determining, for each partial area corresponding to each digit of the numerical value in the meter, whether or not the partial area is clear; and a control unit for selecting, for each digit, an input image in which the partial area corresponding to that digit is determined to be clear, and storing, in the storage unit, at least a part of the selected input image as an evidence image in association with the numerical value recognized by the numerical value recognition unit.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019505670A JP6707178B2 (ja) | 2017-03-17 | 2017-03-17 | 画像処理装置、制御方法及び制御プログラム |
| PCT/JP2017/011039 WO2018167974A1 (fr) | 2017-03-17 | 2017-03-17 | Dispositif de traitement d'images, procédé de commande et programme de commande |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2017/011039 WO2018167974A1 (fr) | 2017-03-17 | 2017-03-17 | Dispositif de traitement d'images, procédé de commande et programme de commande |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018167974A1 true WO2018167974A1 (fr) | 2018-09-20 |
Family
ID=63523497
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/011039 Ceased WO2018167974A1 (fr) | 2017-03-17 | 2017-03-17 | Dispositif de traitement d'images, procédé de commande et programme de commande |
Country Status (2)
| Country | Link |
|---|---|
| JP (1) | JP6707178B2 (fr) |
| WO (1) | WO2018167974A1 (fr) |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2020052981A (ja) * | 2018-09-28 | 2020-04-02 | 株式会社東芝 | 情報処理装置、学習装置、情報処理システム、情報処理方法及びコンピュータプログラム |
| JP2020087397A (ja) * | 2018-11-19 | 2020-06-04 | 奇邑科技股▲ふん▼有限公司 | 知能情報読み取り方法、装置とシステム |
| JP2021012665A (ja) * | 2019-07-09 | 2021-02-04 | 三菱重工業株式会社 | 指示値読取システムおよび方法並びにプログラム |
| WO2022014145A1 (fr) * | 2020-07-14 | 2022-01-20 | ダイキン工業株式会社 | Dispositif de traitement d'image, système de traitement d'air, programme de traitement d'image et procédé de traitement d'image |
| JP2022091065A (ja) * | 2020-12-08 | 2022-06-20 | トヨタ自動車東日本株式会社 | 色判別装置及び色判別方法 |
| JP2023011289A (ja) * | 2021-07-12 | 2023-01-24 | 国立大学法人 鹿児島大学 | 文字認識装置及び文字認識プログラム |
2017
- 2017-03-17 JP JP2019505670A patent/JP6707178B2/ja active Active
- 2017-03-17 WO PCT/JP2017/011039 patent/WO2018167974A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH01177178A (ja) * | 1988-01-04 | 1989-07-13 | Sumitomo Electric Ind Ltd | 文字認識装置 |
| JP2007011654A (ja) * | 2005-06-30 | 2007-01-18 | Seiko Epson Corp | エンボス文字読取方法 |
| WO2015015535A1 (fr) * | 2013-07-31 | 2015-02-05 | グローリー株式会社 | Système de gestion de factures, appareil de de gestion de factures, et procédé de gestion de factures |
| JP5879455B1 (ja) * | 2015-10-16 | 2016-03-08 | 株式会社ネフロンジャパン | 水道検針装置 |
Cited By (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2020052981A (ja) * | 2018-09-28 | 2020-04-02 | 株式会社東芝 | 情報処理装置、学習装置、情報処理システム、情報処理方法及びコンピュータプログラム |
| JP2020087397A (ja) * | 2018-11-19 | 2020-06-04 | 奇邑科技股▲ふん▼有限公司 | 知能情報読み取り方法、装置とシステム |
| JP2021012665A (ja) * | 2019-07-09 | 2021-02-04 | 三菱重工業株式会社 | 指示値読取システムおよび方法並びにプログラム |
| JP7187394B2 (ja) | 2019-07-09 | 2022-12-12 | 三菱重工業株式会社 | 指示値読取システムおよび方法並びにプログラム |
| WO2022014145A1 (fr) * | 2020-07-14 | 2022-01-20 | ダイキン工業株式会社 | Dispositif de traitement d'image, système de traitement d'air, programme de traitement d'image et procédé de traitement d'image |
| JP2022018021A (ja) * | 2020-07-14 | 2022-01-26 | ダイキン工業株式会社 | 画像処理装置、空気処理システム、画像処理プログラム、及び画像処理方法 |
| JP7014982B2 (ja) | 2020-07-14 | 2022-02-15 | ダイキン工業株式会社 | 画像処理装置、空気処理システム、画像処理プログラム、及び画像処理方法 |
| CN116157832A (zh) * | 2020-07-14 | 2023-05-23 | 大金工业株式会社 | 图像处理装置、空气处理系统、图像处理程序及图像处理方法 |
| US11810328B2 (en) | 2020-07-14 | 2023-11-07 | Daikin Industries, Ltd. | Image processing device, air treatment system, image processing program, and image processing method |
| JP2022091065A (ja) * | 2020-12-08 | 2022-06-20 | トヨタ自動車東日本株式会社 | 色判別装置及び色判別方法 |
| JP2023011289A (ja) * | 2021-07-12 | 2023-01-24 | 国立大学法人 鹿児島大学 | 文字認識装置及び文字認識プログラム |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2018167974A1 (ja) | 2019-07-25 |
| JP6707178B2 (ja) | 2020-06-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6707178B2 (ja) | 画像処理装置、制御方法及び制御プログラム | |
| US8494268B2 (en) | Method and arrangement for retrieving information comprised in a barcode | |
| US9721532B2 (en) | Color chart detection apparatus, color chart detection method, and color chart detection computer program | |
| US7349588B2 (en) | Automatic meter reading | |
| US20150264324A1 (en) | Image processing device, image capture device, image processing method, and image processing program | |
| JP2014195148A (ja) | 画像処理装置、領域決定方法及びコンピュータプログラム | |
| JP6789410B2 (ja) | 画像処理装置、制御方法及び制御プログラム | |
| CN107404721A (zh) | 物联网设备配网方法、图像采集方法及设备 | |
| US8718370B2 (en) | Optical information-reading apparatus and optical information-reading method | |
| CN106650583B (zh) | 人脸检测方法、装置及终端设备 | |
| US7835552B2 (en) | Image capturing apparatus and face area extraction method | |
| WO2018167971A1 (fr) | Dispositif de traitement d'image, procédé de commande, et programme de commande | |
| KR101559338B1 (ko) | 카메라 모듈용 결함 픽셀 평가 시스템 및 이를 사용한 카메라 모듈용 결함 픽셀 평가 방법 | |
| JP2008077430A (ja) | 移動体計数装置および移動体計数方法 | |
| CN104966060A (zh) | 一种运动物体的目标识别方法和装置 | |
| JP2014191685A (ja) | 画像処理装置および画像処理方法 | |
| JP2009017158A (ja) | カメラ検査装置 | |
| JP2020003878A (ja) | マーカおよび画像処理装置 | |
| JP2019220069A (ja) | カード番号認識装置およびカード番号認識方法 | |
| CN110786009B (zh) | 检测贝尔图像方法、设备、机器可读存储介质 | |
| JP6851337B2 (ja) | 撮像装置、制御方法及び制御プログラム | |
| JP2012133587A (ja) | 画像解析装置、画像解析方法及びプログラム | |
| US10051232B2 (en) | Adjusting times of capture of digital images | |
| JP4312185B2 (ja) | ゲーム用マット、カードゲームシステム、画像解析装置、画像解析方法 | |
| JP5931248B1 (ja) | 検査システム及び検査方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17901047; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2019505670; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17901047; Country of ref document: EP; Kind code of ref document: A1 |