
GB2371737A - Slicing machine

Slicing machine

Info

Publication number
GB2371737A
GB2371737A GB0201954A GB0201954A GB2371737A GB 2371737 A GB2371737 A GB 2371737A GB 0201954 A GB0201954 A GB 0201954A GB 0201954 A GB0201954 A GB 0201954A GB 2371737 A GB2371737 A GB 2371737A
Authority
GB
United Kingdom
Prior art keywords
window
image
product
camera
windows
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0201954A
Other versions
GB2371737B (en
GB0201954D0 (en
Inventor
Colin Michael Burton
Geoffrey Thomas Carruth
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AEW Engineering Co Ltd
Original Assignee
AEW Engineering Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GB0102301A external-priority patent/GB0102301D0/en
Priority claimed from GB0102300A external-priority patent/GB0102300D0/en
Application filed by AEW Engineering Co Ltd filed Critical AEW Engineering Co Ltd
Publication of GB0201954D0 publication Critical patent/GB0201954D0/en
Publication of GB2371737A publication Critical patent/GB2371737A/en
Application granted granted Critical
Publication of GB2371737B publication Critical patent/GB2371737B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A22 - BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22C - PROCESSING MEAT, POULTRY, OR FISH
    • A22C17/00 - Other devices for processing meat or bones
    • A22C17/0006 - Cutting or shaping meat
    • A22C17/0033 - Cutting slices out of a piece of meat
    • A - HUMAN NECESSITIES
    • A22 - BUTCHERING; MEAT TREATMENT; PROCESSING POULTRY OR FISH
    • A22C - PROCESSING MEAT, POULTRY, OR FISH
    • A22C17/00 - Other devices for processing meat or bones
    • A22C17/0073 - Other devices for processing meat or bones using visual recognition, X-rays, ultrasounds, or other contactless means to determine quality or size of portioned meat
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B26 - HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26D - CUTTING; DETAILS COMMON TO MACHINES FOR PERFORATING, PUNCHING, CUTTING-OUT, STAMPING-OUT OR SEVERING
    • B26D5/00 - Arrangements for operating and controlling machines or devices for cutting, cutting-out, stamping-out, punching, perforating, or severing by means other than cutting
    • B26D5/007 - Control means comprising cameras, vision or image processing systems
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B26 - HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26D - CUTTING; DETAILS COMMON TO MACHINES FOR PERFORATING, PUNCHING, CUTTING-OUT, STAMPING-OUT OR SEVERING
    • B26D7/00 - Details of apparatus for cutting, cutting-out, stamping-out, punching, perforating, or severing by means other than cutting
    • B26D7/27 - Means for performing other operations combined with cutting
    • B26D7/30 - Means for performing other operations combined with cutting for weighing cut product
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 - Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/02 - Food
    • G01N33/12 - Meat; Fish
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B26 - HAND CUTTING TOOLS; CUTTING; SEVERING
    • B26D - CUTTING; DETAILS COMMON TO MACHINES FOR PERFORATING, PUNCHING, CUTTING-OUT, STAMPING-OUT OR SEVERING
    • B26D2210/00 - Machines or methods used for cutting special materials
    • B26D2210/02 - Machines or methods used for cutting special materials for cutting food products, e.g. food slicers

Landscapes

  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Food Science & Technology (AREA)
  • Forests & Forestry (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Health & Medical Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Analytical Chemistry (AREA)
  • Medicinal Chemistry (AREA)
  • Image Processing (AREA)

Abstract

A slicing machine for cutting slices of food product includes a camera (1 fig 2) viewing the cut face of the food product 2 and providing an image on a VDU, a window or windows 8, 9, 10, 11 being superimposed on said image for an image processor to process image signals within said window(s) and a computerised control system controlling the machine in response to the processed image signals, the size and position of said window or windows being at least partially set by the operator using a manually operable control device, eg a mouse. The operator may set the corner points of the window(s). The bottom and left borders 9, 10 may be set to align with the shear edges 3, 4 against which the food product abuts. One or more smaller windows (19, 20 fig 6) may be defined around areas of particular interest on the cut face, eg areas of lean meat and/or fat (17, 18 fig 6), for further image processing. Perspective correction means may be used to correct image data from the camera before or after processing, eg if the cut face is inclined relative to the camera axis (fig 2).

Description

Title: Improvements in Slicing Machines

Field of the Invention

This invention relates to a slicing machine for cutting slices from a food product, such as cheese or meat, but especially bacon or similar meat products, containing regions of fat and regions of lean meat.
Background to the Invention

Typically such a slicing machine includes a rotating blade and means to feed the product forward towards the blade so that successive slices are cut from one face of the product.
The distance through which the product is advanced between successive cuts of the blade determines the thickness of the slices. Where the product is of uniform shape and density, it may be sufficient to use a single predetermined slice thickness to give a slice or group of slices of the required weight. In general, however, variations in the shape and density of the product mean that the weight of a slice of a given thickness varies. A previous approach to dealing with this variation is described and claimed in the applicant's granted European Patent EP-B-0,127,463. This patent describes and claims a process in which an automatic slicing machine is programmed to vary the thickness of the slices in accordance with a typical weight distribution for the product. Although this system achieves good results where the product shape or envelope varies in a predictable manner, it still tends to produce a number of slices which are outside the required weight range when the actual weight density distribution departs from the expected distribution.
Prior Art

The present applicant's US Patent No. 5267168 describes a method of controlling a slicing machine, more especially to enable slices of equal weight to be cut in spite of shape and density variations in the cut face of the product, in which a camera views the cut face of the product and image data from the camera is processed to determine a parameter characteristic of the cut face, a computer control system controlling operation of the slicing machine accordingly. In practice, the characteristic parameter is computed from image data obtained within a window within the field of view of the camera. This window could, for example, in order to determine the area of the cut face, be set to align at the bottom and left side border with shear edges against which the blade presses when slicing, the window embracing the product at the top and right hand borders. The purpose of setting up the window and processing only image data obtained from within the window is to save data processing time. However, the procedure for setting up the window is a lengthy trial and error process which requires successive rounds of data entry and data saving appertaining to the co-ordinates of the corners of the window or windows, each round followed by inspection of the display, eventually, for example in the case of a window embracing the image of the cut face on the VDU, to align the window with the above mentioned shear edges.
The Invention

According to the invention, there is provided a slicing machine for cutting slices from a food product, comprising a camera which views a cut face of the product, a visual display device (VDU) on which an image of the cut face is displayed to an operator, a computerised control system including an image processor, for defining a window or windows superimposed on the said image on the VDU, a set size and position of the window or windows being input to the image processor to enable the processor to process image signals within the window or windows, and a manually operable control device controlled by the operator at least partially to set the size and position of the window or windows, the computerised control system controlling the action of the slicing machine responsively to the processed image signals obtained from within the window or windows.
Preferably the manually operable control device is a mouse, with the aid of which the operator is able to set the corner points of the window one after the other, changing the length and position of the border lines as appropriate, and then setting the co-ordinates of the corner points successively by clicking the left mouse button. Thus, for example, in the case of a window embracing the image of the product on the VDU, the operator is readily able to set the bottom and left hand borders of the window to align with the shear edges against which the product abuts in use.
The mouse can also be used to set up a further smaller window (or windows), for example embracing one or more areas of lean meat, in the case of bacon, in the image of the product on the VDU, so as to enable image signals within the smaller window or windows to be processed. Taking account of the results of processing signals obtained from within the further window or windows enables more accurate grading signals to be generated for the slices.
It will thus be apparent that the operator is able to set the position and shape of the window at one attempt, avoiding the need for repetitive entry and data saving. This not only makes the time required to set up the window much shorter, but also makes it much easier.
In use it may not be appropriate or convenient to position a camera so as to view the end face of a product log on axis, and if the camera is positioned off axis the axis of the product log will be inclined, typically by about 20 degrees, to the axis of the camera.
Therefore, in accordance with a preferred feature of the invention, perspective correction is applied to the signals obtained from the camera, before or during their processing, to compensate for any off-axis viewing of the product by the camera, which causes the apparent shape and dimensions of an object to vary depending on where it is situated in the camera field of view. Thus for example a rectangular area in the plane normally occupied by the end face of the product log will appear trapezoidal in the image formed in the camera. This distortion is sometimes referred to as viewing angle distortion and it is known to apply perspective correction to image data obtained using a camera which views at an off-axis angle, so as to reduce the sizing errors which can otherwise arise by counting pixels. Thus for example if the area of an object is computed by summing the camera signal pixels contained within the edge or boundary of the image of the object as it appears on a camera CCD, an incorrect area measurement for the object can result, if a correction factor is not applied.
Typically the camera image signal is in, or is converted to, a digital format, each digital value numerically representing the charge depletion of a tiny region (pixel) in the CCD, caused by the light incident thereon in the image formed on the CCD. When in this form, the image data obtained by a single read-out of the CCD can be stored in a digital memory device, known as a frame store.
Applying a threshold so as to separate digital values according to numerical value allows pixel counting to be limited to pixels whose grey level (i.e. position in the scale between black and white) is above (or below) one value (determined by the selection of the detection threshold value). Thus the area of (say) a region whose colour allows its pixels to be separated from its surroundings in the image can be estimated by counting the pixels whose values satisfy a detection threshold which distinguishes their grey level values from those of its surroundings. This allows an area of darker lean meat to be distinguished relative to surrounding lighter coloured fat.
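By way of illustration only, such a thresholding and pixel-counting step might be sketched as follows, assuming 8-bit grey-level data held in a NumPy array read out of the frame store; the threshold value and the function name are illustrative assumptions, not taken from the patent.

    import numpy as np

    def lean_area_pixels(grey, threshold=120):
        """Count pixels darker than the detection threshold.

        grey      : 2-D uint8 array read out of the frame store (0 = black, 255 = white)
        threshold : grey level chosen to separate darker lean meat from lighter fat
        Returns an (uncorrected) estimate of the lean area, in pixel units.
        """
        mask = grey < threshold        # True where the pixel is darker than the threshold
        return int(mask.sum())         # area estimate obtained simply by counting pixels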
The perspective correction may be applied to the stored image data in such a way as to produce a modified set of digital values, which if displayed will produce a representation of the image which would have been formed on the camera CCD if the camera had been viewing the object with the latter in a plane perpendicular to the optical axis of the camera lens system, i.e. on-axis, and the thresholding and area computation is performed on the modified digital values. Alternatively the unmodified stored digital values may be subjected to the same thresholding so as to separate pixels corresponding to lighter and darker areas, but depending on where any particular pixel lies in relation to the array of pixels determined by the CCD, a weighting (or correction factor) is applied to it (using pre-stored information) so that if its digital value satisfies the grey level threshold, its contribution to the area count of similar adjoining pixels also satisfying the threshold will be adjusted to compensate for the off-axis viewing angle.
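The second alternative, in which each threshold-satisfying pixel contributes a pre-stored weight to the area count rather than the image first being rectified, might be sketched as follows; the factor map is assumed to have been filled during the calibration described below, and all names and units are illustrative.

    import numpy as np

    def corrected_lean_area(grey, factors, threshold=120):
        """Sum per-pixel correction factors over the thresholded region.

        grey    : 2-D uint8 grey-level image from the frame store
        factors : 2-D float array of the same shape, giving the true area (e.g. in mm^2)
                  that each pixel represents, pre-stored during calibration
        Returns the perspective-corrected lean area in the units of `factors`.
        """
        mask = grey < threshold
        return float(factors[mask].sum())   # each qualifying pixel contributes its own weight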
Where perspective correction has been applied to the image data, an accurate estimate of the area of any group of threshold satisfying pixels will be obtained.
In one preferred method perspective correction is performed by modifying the digital image data values using a mathematical algorithm, values for variable components of which are obtained by a calibration process. The latter typically involves the step of placing a rectangular set-up scale of known dimensions in the camera field of view in the same plane and position which will be occupied by the end face of the product log, and inputting the true dimensions of each of the edges of the scale to a computer to which uncorrected image data from the camera which is viewing the scale are also fed. Scaling the input dimensions to take account of de-magnification in the camera optics allows numerical values for the edges of the scale to be stored for future comparison.
If the uncorrected image data is used to produce, on the VDU display, an image of what is presented to the camera CCD, a mouse or other pointing device associated with the computer can be used to identify the end points, and therefore the lengths, of each of the edges of the image of the scale in the display. By comparing the dimensions of each displayed edge with the stored correct length value for that edge, the computer can compute a correction factor to be applied in that region of the camera image, to adjust the image data (or size data derived therefrom) so as to obtain a corrected dimension in that region. Repeating the process for each of the edges of known dimensions enables an algorithm to be formulated which can be used to interpolate for pixels which do not lie on the edges of the scale, so that data from all pixels, at least within the area of the scale, can subsequently be corrected. Alternatively, a correction factor value can be computed for every pixel location, or for selected pixel positions throughout the CCD array, correction factor values for pixel locations between those stored being computed by interpolation, so that image data obtained during subsequent scans can be corrected using an appropriate correction factor as the data relating to each pixel is processed.
In a particularly preferred arrangement the computer may for example use the co-ordinates of the pixels at the corners of the image of the rectangular scale (identified by using a mouse or other pointing device) and the known true size of the edges of the scale, to compute a correction factor for those corner co-ordinates, and by interpolation, for all other pixel co-ordinate positions at least within the area of the scale. The correction factors can be stored in a memory organised as a look-up table which is then addressed according to pixel position to correct the digital values obtained by reading out the camera CCD when the scale has been replaced by the end face of the product log and it is an image of the latter which is formed on the CCD, in place of the scale. By using the stored correction factors from the look-up table memory, as required, the digital picture signals stored in the frame store may be modified so as to produce in a VDU a corrected image, corresponding to what would have been focused on the camera CCD if the camera had viewed the cut end face on-axis. By doing so, the shape of regions of darker lean meat and lighter coloured fat in the end face of a side of bacon will be correctly reproduced in the VDU display. If it is desired to circumscribe one of the regions using a computer generated window using a mouse or other pointing device, the accurate positioning and proportioning of the window is thereby facilitated.
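One way the corner-based calibration and look-up table might be realised is sketched below. The corner labels, the units, and the use of simple bilinear interpolation between the four corner factors are assumptions made for illustration; the patent does not prescribe a particular interpolation scheme.

    import numpy as np

    def corner_area_factors(corners_px, true_w, true_h):
        """Per-pixel area factors (e.g. mm^2 per pixel) at the four corners of the scale.

        corners_px : pixel (x, y) co-ordinates of the scale corners as identified with
                     the mouse, keyed 'tl', 'tr', 'bl', 'br'
        true_w/h   : true width and height of the rectangular set-up scale (e.g. in mm)
        """
        def edge_len(a, b):
            return np.hypot(a[0] - b[0], a[1] - b[1])

        top = edge_len(corners_px['tl'], corners_px['tr'])
        bottom = edge_len(corners_px['bl'], corners_px['br'])
        left = edge_len(corners_px['tl'], corners_px['bl'])
        right = edge_len(corners_px['tr'], corners_px['br'])

        # mm-per-pixel along the two edges meeting at each corner, multiplied together
        # to give the true area represented by one pixel near that corner.
        return {
            'tl': (true_w / top) * (true_h / left),
            'tr': (true_w / top) * (true_h / right),
            'bl': (true_w / bottom) * (true_h / left),
            'br': (true_w / bottom) * (true_h / right),
        }

    def factor_lookup_table(corner_factors, width, height):
        """Bilinearly interpolate the corner factors over a width x height pixel grid."""
        u = np.linspace(0.0, 1.0, width)[None, :]    # 0 at the left edge, 1 at the right
        v = np.linspace(0.0, 1.0, height)[:, None]   # 0 at the top edge, 1 at the bottom
        f = corner_factors
        return ((1 - u) * (1 - v) * f['tl'] + u * (1 - v) * f['tr'] +
                (1 - u) * v * f['bl'] + u * v * f['br'])

The resulting table could then serve as the `factors` argument in the weighted area-count sketch given earlier.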
Perspective correction techniques are described in Digital Image Processing, 1993, ISBN 0201-600-76-1, Section 25.2, entitled Perspective transformations (Rafael C. Gonzalez and Richard E. Woods). According therefore to another aspect of the invention, there is provided a slicing machine for cutting slices from a food product, comprising a camera which views a cut face of the product with the cut face inclined relative to a plane normal to the camera axis, and a frame store for storing digital image data from the camera, a visual display device (VDU) on which an image of the cut face is displayed to an operator, a computerised control system including an image processor, for defining a window or windows superimposed on the said image on the VDU, a set size and position of the window or windows being input to the image processor to enable the processor to process image signals within the window or windows, and a manually operable control device controlled by the operator at least partially to set the size and position of the window or windows, the computerised control system controlling the action of the slicing machine responsively to the processed image signals obtained from within the window or windows, and the image processor incorporating perspective correction means for correcting the stored image data to compensate for the viewing angle.
The perspective correction used where the cut face is inclined to a plane normal to the axis of the camera, is employed to correct for the non-rectangularity of the image caused by the inclination.
If the camera is on-axis but merely angled so as to look up or down, the correction is only required in one plane (i.e. in a vertical sense).
If the camera views the product from above (or below) and from one side or the other, the correction is needed in two planes, i.e. both vertical and horizontal.
The perspective correction means typically applies to the image data signals a correction factor determined by pixel counts of dimensions of a known object located in the machine in place of the product during a calibration step as aforesaid. Thus the correction factors for each pixel position may be computed during a calibration step in which the cut face is replaced by a rectangular set-up scale of known dimensions. In the image of the set-up scale, the edges will be distorted due to perspective effects, but the true dimensions of the edges of the scale can be identified to the processor, thereby enabling correction factors for pixel co-ordinates throughout the field of view to be determined, and stored for future reference either in a look-up table memory or in the form of an algorithm whose functionality varies with pixel co-ordinate for adjusting image data values during subsequent scans.
Description of embodiment

In the accompanying drawings:

Fig 1 shows a view on a display monitor of the product in a slicing machine, as seen by a viewing camera;
Fig 2 is a view of part of the slicing machine having an LED mounted on a top roll;
Fig 3 illustrates use of the LED to set a top roll cut-off line;
Fig 4 is a view of part of a slicing machine having a movable side guide and associated LED;
Fig 5 illustrates use of the side guide to set a side cut-off line; and
Fig 6 shows use of the viewing system to set up windows embracing features of interest in the product.
In the present invention, the basic slicing machine is generally similar to that described in US Patent No. 5267168. The product is held down by a top roll on the bed of a slicing machine against a vertical side plate. A camera views the end face of the product and the image data is used to calculate characteristic features of the end face, as seen by the camera, which in turn are used to control the thickness of slice to be cut by a slicing knife. The speed with which the calculations can be made affects the overall speed of working of the machine and, for this reason, the US Patent discloses the setting up of a window within the field of view of the camera, which embraces the end face of the product, the calculations being made only on image data derived from within the window.
However, in the known machine, window set up is a lengthy process, and the window is not accurately defined at a top edge and a side edge, having regard to the fact that, during slicing, the height and width of the product can change along its length. In addition, inadequate correction is made for the fact that the camera views the product end face at an angle, typically about 20 degrees.
Easier and more accurate definition of one or more viewing windows would enable better control of the slicing knife and also enable the machine to operate at a higher speed.
Referring to Fig 1, in the machine of the present invention, during window set-up, the cut-off lines which define the sides of the window of interest are positioned. The lines 8, 9, 10 and 11 are superimposed over the camera image, as shown on a display monitor, by the viewing system.
In the known machine, numerical values for the co-ordinates of the quadrilateral, which is non-rectangular owing to the angle of view, are entered, to define the window of interest.
This is a trial and error process which requires successive rounds of data entry and data saving, followed by inspection of the display, eventually to superimpose the quadrilateral on the shear edges which restrain the product.
In the machine of the present invention, the use of a mouse, or similar manual control device, enables the line 9 to be aligned with the bottom shear edge 4, as seen on the viewing monitor, and likewise enables the line 10 to be aligned with the left-hand shear edge 3. Line 11 can then be set to the maximum width of the product and line 8 to the maximum height of the product. The setting of the lines in this way, by means of a mouse, is much easier and quicker than entering x, y co-ordinates in a table in a trial and error fashion, because the operator has immediate visual feedback and is able to set up the window correctly first time.
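A minimal sketch of how the four cut-off lines might be represented and used to restrict processing to the window is given below; the co-ordinate convention (rows increasing downwards) and the class name are assumptions made for illustration, not details of the machine's software.

    from dataclasses import dataclass

    import numpy as np

    @dataclass
    class Window:
        top: int      # row of line 8, just below the top roll
        bottom: int   # row of line 9, aligned with the bottom shear edge 4
        left: int     # column of line 10, aligned with the left-hand shear edge 3
        right: int    # column of line 11, at the maximum product width

        def crop(self, image: np.ndarray) -> np.ndarray:
            """Return only the image data inside the window for further processing."""
            return image[self.top:self.bottom, self.left:self.right]

Only the cropped data need then be passed to the thresholding and area routines, which is what saves processing time.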
In the slicing machine of the present invention, as in the slicing machine of the US Patent, the top roll which bears down on the product is the last section of the drive which imparts a forward motion to the product during slicing. It is conventionally heavily knurled or spiked for this purpose. A side effect is that the knurling or spikes can collect small pieces of product, making it difficult for the viewing system to distinguish between the product on the top roll and the main product body. Although software techniques have been proposed in an endeavour to deal with this problem, these depend on evaluation of changing grey levels and are thus subject to jitter, resulting in errors in thickness evaluation. The software techniques also take up processing time and thus slow down the overall speed of operation of the machine.
In the machine of the present invention, making reference to Fig 2, the camera 1 views the product 2 located on the machine bed, enabling the slicing knife in the form of a rotating blade (not shown) to slice the product against the vertical shear edge 3 and the horizontal shear edge 4. The top roll 6 which drives the product forwardly is mounted on a tie bar and on this is mounted an LED 7. The top bar assembly is able to move up and down as the height of the product changes during slicing. Referring now to Fig 3, showing the view from the display monitor, the top line 8 of the window is positioned just below the top roll.
The distance between the image of the LED 7 and the line 8 is kept constant by the viewing system. As the top roll 6 lifts and lowers to follow a changing height of the product, the cut-off line 8 follows it dynamically. The use of the LED 7 dynamically to set the top line of the window embracing the product results in a more accurate evaluation of slice thickness and faster processing of the image, enabling machine operation at a higher speed.
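The dynamic tracking of line 8 from the LED image might be sketched as follows; locating the LED as the brightest pixel in the image and using a fixed pixel offset are simplifying assumptions for illustration, not details given in the patent.

    import numpy as np

    def track_top_line(grey, offset_rows=12):
        """Return the row for cut-off line 8, a fixed distance below the LED image.

        grey        : 2-D grey-level image containing the bright LED 7 on the top roll
        offset_rows : constant pixel offset maintained between the LED and line 8
        """
        led_row, _led_col = np.unravel_index(int(np.argmax(grey)), grey.shape)
        return int(led_row) + offset_rows      # line 8 follows the top roll dynamically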
In addition, the machine of the present invention preferably includes an LED to set up a dynamic right hand side line to the window area. This is beneficial in reducing wasted material at the end of the product log. Thus, referring to Fig 4, a machine side guide 13 is pressed against the right hand side of the product, typically by an air cylinder 14. The LED 12 is mounted on the side guide. As the guide moves in and out due to variations in product width, the LED 12 moves with it. The setting up of the side guide is illustrated in Fig 5. The product is located on the bottom shear edge 4 against the left hand shear edge 3. The vertical side cut-off line 15 is positioned using the mouse to coincide with the left-hand side of the side guide which is in contact with the product. In use, the viewing system tracks the position of the LED 12 dynamically to maintain the position of the cut-off line 15 at a fixed distance to the left of the LED. The advantages obtained are analogous to those obtained by use of a top roll LED as hitherto described, bearing in mind that the side guide is also likely to become coated with pieces of product which the viewing system cannot readily distinguish from the main product.
Thus, in the machine of the present invention, the quadrilateral defining the window of interest embracing the product is in use completely defined by the left hand and bottom shear edges (fixed), the dynamic top line set by the top roll LED and the dynamic right hand side line set by the side guide LED.
The machine in accordance with the present invention is also able, by means of the mouse, to define particular regions of interest for analysis. Two or more particular regions may be defined; for instance two defined regions may be called the primary region and the secondary region. Thus, in bacon slicing for example, grading of the slices can be set up by using the primary region to define the darker primary lean region of the meat and using the secondary region to define the lighter coloured secondary lean region of the meat. The regions are set up simply by use of the mouse to drag the sides of the defining rectangles, generated by the viewing system and superimposed on the overall image of the product.
Immediate feedback to the viewing monitor enables the operator to set the rectangles correctly first time. The image data within the defined regions is analysed for lean meat content during slicing and enables improved grading of the product, for example so that lean meat appears in a particular place in a final sliced pack for shop display purposes.
This added facility is advantageous compared with a machine which only examines the entire slice area for lean/fat content, and affords flexibility and adaptability for the needs of the end-user for grading a wide range of product shapes. Referring to Fig 6, there is shown the viewing operator's monitor display set up for two special lean/fat assessment areas of interest. The two areas of interest are marked 17 and 18, and the first rectangle 19 is set up to enable assessment of the darker primary lean meat 17 and the second rectangle 20 is set up to enable assessment of the lighter secondary lean meat 18 contained within the product 2. The rectangles are quickly and easily set by dragging their sides into position using the mouse control. For different products and/or uses, the rectangles can readily be redefined. Although US Patent No. 5267168 uses an analysis of grey levels over the whole field to determine the lean to background grey level threshold, a process which may be termed autothresholding, the present invention enables autothresholding to be carried out specifically in relation to areas of interest on which the above-described windows 19 and 20 are superimposed.
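By way of illustration, autothresholding restricted to one of the rectangles 19 or 20 might be sketched as below using Otsu's method; the patent does not specify which autothresholding algorithm is used, and the window tuple convention is an assumption.

    import numpy as np

    def otsu_threshold(grey):
        """Otsu's method: choose the grey level that best separates two pixel populations.

        grey : 2-D uint8 array containing only the pixels of one region of interest.
        """
        hist = np.bincount(grey.ravel(), minlength=256).astype(float)
        total = hist.sum()
        sum_all = float(np.dot(np.arange(256), hist))
        w0 = 0.0          # cumulative pixel count of the darker class
        sum0 = 0.0        # cumulative grey-level sum of the darker class
        best_t, best_var = 0, 0.0
        for t in range(256):
            w0 += hist[t]
            sum0 += t * hist[t]
            w1 = total - w0
            if w0 == 0.0 or w1 == 0.0:
                continue
            mean0, mean1 = sum0 / w0, (sum_all - sum0) / w1
            between = w0 * w1 * (mean0 - mean1) ** 2   # between-class variance
            if between > best_var:
                best_var, best_t = between, t
        return best_t

    def lean_fraction_in_window(grey, window):
        """Autothreshold only the image data inside one region-of-interest window."""
        top, bottom, left, right = window              # pixel bounds of rectangle 19 or 20
        roi = grey[top:bottom, left:right]
        t = otsu_threshold(roi)
        return float((roi <= t).mean())                # fraction of darker (lean) pixels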
Overall, the viewing system herein described enables grading taking into account a variety of parameters, including the slice height at the highest point, the percentage of slice width above a given height, the slice width at the widest point, lean meat distribution using statistical variance, and lean meat lengths and positions in the primary and secondary regions of interest.
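A rough sketch of how some of these parameters might be derived from a thresholded (boolean) product mask within the window is given below; the exact definitions used by the machine are not stated in the patent, so these are illustrative approximations only.

    import numpy as np

    def grading_parameters(product_mask, height_limit):
        """Illustrative shape parameters from a boolean mask of the product in the window.

        product_mask : 2-D bool array, True where the cut face is product
        height_limit : the 'given height' (in pixels) for the width-above-height figure
        """
        col_heights = product_mask.sum(axis=0)         # product height in each image column
        row_widths = product_mask.sum(axis=1)          # product width in each image row

        max_height = int(col_heights.max())            # slice height at the highest point
        max_width = int(row_widths.max())              # slice width at the widest point
        cols_present = col_heights > 0
        pct_above = 100.0 * (col_heights > height_limit).sum() / max(int(cols_present.sum()), 1)
        return max_height, max_width, float(pct_above) # % of slice width above the given height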
The machine in accordance with the invention also provides an improved means of perspective correction, which is needed because the cut face of the product is at an angle to the axis of the camera. This means that the pixels representing the near part of the product represent a smaller area than those further away.
The correction parameters are obtained by placing an accurately delineated rectangular scale on the vertical shear edge in place of the product. This is sized to represent the maximum possible height of a product. The number of pixels along each edge of the image of the scale is compared with the known true dimension of that edge, the edges having been defined to the viewing system using the mouse.
The quadrilateral defining the perspective set-up scale is thus:
1) the fixed left line coinciding with the left hand shear edge;
2) the right hand edge of the scale;
3) the fixed top line coinciding with the top of the scale; and
4) the bottom shear edge.
Using the above described measurements as data, in use the system calculates the area of the product that each pixel represents to obtain a correction factor, and then uses this correction factor to modify all areas and lengths which are taken into account during grading and slicing of the product. This true perspective correction of every pixel is much more accurate than any previously proposed method of perspective correction and is made possible by the use of faster processing techniques now available.
Thus in practice, the improved viewing system enables complete assessment of the determined parameters of the end face of the product being viewed while a single slice is being cut, and enables this assessment to be used for slicing control during cutting of the next slice.

Claims (13)

  1. A slicing machine for cutting slices from a food product, comprising a camera which views a cut face of the product, a visual display device (VDU) on which an image of the cut face is displayed to an operator, a computerised control system including an image processor, for defining a window or windows superimposed on the said image on the VDU, a set size and position of the window or windows being input to the image processor to enable the processor to process image signals within the window or windows, and a manually operable control device controlled by the operator at least partially to set the size and position of the window or windows, the computerised control system controlling the action of the slicing machine responsively to the processed image signals obtained from within the window or windows.
  2. A slicing machine according to claim 1 wherein the manually operable control device is movable to set the corner points of the window one after the other.
  3. A slicing machine according to claim 2 wherein the manually operable control device is a mouse.
  4. A slicing machine as claimed in claim 3 wherein the mouse is also movable to change the length and position of the border lines of the window as appropriate, and to set the co-ordinates of the corner points successively by clicking the left mouse button.
  5. A slicing machine according to any of claims 1 to 4 further comprising perspective correction means adapted to correct the image data obtained from the camera before or during processing of the data.
  6. A method of setting the size and position of a window superimposed on an image on a VDU of the cut face of a food product viewed by a camera which produces a signal for generating the image on the VDU, wherein the bottom and left hand borders of the window are set to align with the shear edges against which the product abuts in use.
  7. A method as claimed in claim 6 further comprising the step of setting the size and position of a second smaller window on the image on the VDU, so as to embrace one particular area of the image of the product on the VDU, so as to enable image signals within the smaller window to be processed.
  8. A method according to claim 7 wherein the food product is meat and the further smaller window is set so as to correspond to a region of lean meat within the cut face of the meat.
  9. A method according to claim 7 or 8 wherein the size and position of a third window, also smaller than the first mentioned window, is also set up by the mouse on the image on the VDU so as to embrace a second particular area of the image of the product.
  10. A method according to claim 9 wherein the second window is set to define a region of dark lean meat and the third window is set to define a region of lighter coloured lean meat.
  11. A method according to claim 9 wherein the image signals within the second and third windows are subjected to auto-thresholding and the areas of darker and lighter lean meat are measured and a grading signal is generated in dependence on the measured volume of said areas to allow the slice of meat to be graded.
  12. A method according to any of claims 6 to 11 further comprising the step of perspective correcting the image data to compensate for the viewing angle of the camera.
  13. A method according to claim 12 wherein a video signal derived from the perspective corrected image data is employed to create the displayed image on the VDU.
GB0201954A 2001-01-30 2002-01-29 Improvements in slicing machines Expired - Fee Related GB2371737B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB0102301A GB0102301D0 (en) 2001-01-30 2001-01-30 Improvements in slicing machines
GB0102300A GB0102300D0 (en) 2001-01-30 2001-01-30 Improvements in slicing machines

Publications (3)

Publication Number Publication Date
GB0201954D0 GB0201954D0 (en) 2002-03-13
GB2371737A true GB2371737A (en) 2002-08-07
GB2371737B GB2371737B (en) 2004-08-11

Family

ID=26245647

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0201954A Expired - Fee Related GB2371737B (en) 2001-01-30 2002-01-29 Improvements in slicing machines

Country Status (2)

Country Link
GB (1) GB2371737B (en)
WO (1) WO2002060656A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006053509A1 (en) * 2004-11-17 2006-05-26 Csb-System Ag Data acquisition for classifying slaughtered animal bodies and for their qualitative and quantitative determination
DE102005013732A1 (en) * 2005-03-22 2006-10-05 Reifenhäuser, Uwe, Dipl.-Ing. Method and device for cutting string-shaped foods
DE102007021510A1 (en) * 2007-05-04 2008-11-06 Maja-Maschinenfabrik Hermann Schill Gmbh & Co. Kg Article e.g. meat, cutting device for use in food industry, has hand-hold device-drive moving upward or downward around hold-down device in straight line to increase or decrease integrity of conveying unit
DE202013004027U1 (en) 2013-04-30 2013-06-06 Foodlogistik Fleischereimaschinen Gmbh Device for cutting food into strips or cubes

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6997089B2 (en) 2002-06-25 2006-02-14 Formax, Inc. Optical grading system for slicer apparatus
GB0612246D0 (en) * 2006-06-21 2006-08-02 Aew Delford Systems Vision system using strobed illumination
US11844357B2 (en) 2017-12-19 2023-12-19 Horst Eger Optically assessing body properties

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0449514A1 (en) * 1990-03-27 1991-10-02 Thurne Engineering Co Ltd Slicing machine

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9006804D0 (en) * 1990-03-27 1990-05-23 Thurne Eng Co Ltd Slicing machine
US5668634A (en) * 1992-07-03 1997-09-16 Newman; Paul Bernard David Quality control and grading system for meat
NZ334675A (en) * 1996-08-23 2000-05-26 United Kingdom Government Method and apparatus for using image analysis to determine meat and carcass characteristics
DE19847232C2 (en) * 1998-05-19 2000-07-13 Csb Syst Software Entwicklung Method for evaluating halves of slaughter by optical image processing
WO2000049400A1 (en) * 1999-02-18 2000-08-24 Colorado State University Research Foundation Meat imaging system for palatability and yield prediction

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0449514A1 (en) * 1990-03-27 1991-10-02 Thurne Engineering Co Ltd Slicing machine

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006053509A1 (en) * 2004-11-17 2006-05-26 Csb-System Ag Data acquisition for classifying slaughtered animal bodies and for their qualitative and quantitative determination
EA014627B1 (en) * 2004-11-17 2010-12-30 Ксб-Зюстем Аг Data acquisition for classifying slaughtered animal bodies and for their qualitative and quantitative determination
US7929731B2 (en) 2004-11-17 2011-04-19 Csb-System Ag Data acquisition for classifying slaughtered animal bodies and for their qualitative and quantitative determination
DE102005013732A1 (en) * 2005-03-22 2006-10-05 Reifenhäuser, Uwe, Dipl.-Ing. Method and device for cutting string-shaped foods
DE102007021510A1 (en) * 2007-05-04 2008-11-06 Maja-Maschinenfabrik Hermann Schill Gmbh & Co. Kg Article e.g. meat, cutting device for use in food industry, has hand-hold device-drive moving upward or downward around hold-down device in straight line to increase or decrease integrity of conveying unit
DE202013004027U1 (en) 2013-04-30 2013-06-06 Foodlogistik Fleischereimaschinen Gmbh Device for cutting food into strips or cubes

Also Published As

Publication number Publication date
GB2371737B (en) 2004-08-11
GB0201954D0 (en) 2002-03-13
WO2002060656A1 (en) 2002-08-08

Similar Documents

Publication Publication Date Title
US6813389B1 (en) Digital image processing method and system including noise reduction and tone scale adjustments
AU2005200016B2 (en) Method and system for portioning workpieces to user-scanned shape and other specifications
US5267168A (en) Apparatus for and method of controlling slicing machine
US6275600B1 (en) Measuring image characteristics of output from a digital printer
US5054345A (en) Method of obtaining constant weight portions or slices of sliced food products
EP0218628B1 (en) Method for determining the color of a scene illuminant from a color image of the scene
US5324228A (en) Method and apparatus for detecting and trimming fat from meat products
EP1412920B1 (en) A general purpose image enhancement algorithm which augments the visual perception of detail in digital images
US5136906A (en) Slicing machine
JP3741474B2 (en) Bending order selection method and selection apparatus for bending machine
US20180143607A1 (en) Method and system for portioning workpieces using reference shape as a directly controlled characteristic
CN118543958A (en) Laser cutting path control method and system based on machine vision
GB2371737A (en) Slicing machine
US6301377B1 (en) Gel electrophoresis image warping
CN104999504A (en) Size measurement method of corrugated board and paper separation pressure line control method and system
GB2341922A (en) Surface vibration analysis
CN114820606A (en) Laser cutting equipment control system and method based on visual positioning
WO2002060657A1 (en) Slicing machine having an apparatus for setting the size o an image processor window
CN117495827A (en) Image evaluation method based on skin detector
US7062108B2 (en) Method for estimating the appearance of noise in images
JPH0929693A (en) Fixed weight cutter
CN105654499B (en) A kind of image evaluation method of laser surface modification
US11188049B2 (en) Method and system for portioning workpieces using reference shape as a directly controlled characteristic
Fernandes et al. Detection and quantification of microorganisms in a heterogeneous foodstuff by image analysis
WO1999054717A2 (en) Method and apparatus for identification of probable defects in a workpiece

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20190129