WO2025111689A1 - Devices, systems and methods for processing or propagating plants - Google Patents


Info

Publication number
WO2025111689A1
Authority
WO
WIPO (PCT)
Prior art keywords
wire
plant
image data
sensor
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/CA2024/000015
Other languages
French (fr)
Inventor
Mehrdad RAJI KERMANI
Moteaal ASADI SHIRZI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Western Ontario
Original Assignee
University of Western Ontario
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Western Ontario filed Critical University of Western Ontario
Publication of WO2025111689A1

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • AHUMAN NECESSITIES
    • A01AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01GHORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G2/00Vegetative propagation
    • A01G2/30Grafting
    • A01G2/32Automatic apparatus therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J15/00Gripping heads and other end effectors
    • B25J15/02Gripping heads and other end effectors servo-actuated
    • B25J15/0206Gripping heads and other end effectors servo-actuated comprising articulated grippers
    • B25J15/0226Gripping heads and other end effectors servo-actuated comprising articulated grippers actuated by cams
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02Sensing devices
    • B25J19/021Optical sensing devices
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/50Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/56Extraction of image or video features relating to colour
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/776Validation; Performance evaluation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects

Definitions

  • the present invention relates to agriculture and horticulture, and more particularly to tasks that typically involve handling or inspection of individual plants.
  • Plant processing and propagation includes many labour intensive tasks, such as pruning, localized spraying, clipping, and the like.
  • clipping of plants is a time-consuming and laborious task.
  • Clipping is the task of putting a rubber band or plastic clip around a plant’s main stem and a supporting structure, such as a stake at a particular point along the stem to provide additional support to the plant.
  • the clipping task involves a human worker bending or kneeling on the floor while using two hands to stretch a rubber band or to put a plastic clip around the plant and the stake. This is a physically demanding and painstaking task, and most modern greenhouses still require a significant amount of manual labour to execute the clipping task.
  • a clipping device comprising: a passively rotating spool supporting and supplying a roll of wire; a feeder wheel pushing the wire through an entrance of a wire guide and an exit of a wire guide; a cam co-rotationally coupled to the feeder wheel; a rotational actuator driving rotation of the feeder wheel and the cam; a bender positioned proximal to the exit of the wire guide, the bender providing a strike surface for curving the wire into a circular clip; a cutter positioned proximal to the exit of the wire guide, the cutter providing an edge for cutting the wire; a lever having a first end abutting the cam, a second end positioning the cutter and the bender, and an intermediate pivot point; the lever following the cam to move from a first pivot position aligning the bender strike surface with wire pushed through the exit end to a second pivot position sweeping the cutter edge across the exit end to cut the wire.
  • a method for processing plants comprising: acquiring image data of a plant with an image sensor; analyzing the acquired image data with a machine vision component to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identifying a target point in the plant based on the segmented image data; determining distance/depth from the image sensor to the target point in the plant; generating a control signal based on the distance/depth and sending the control signal from a controller to position a robot at the target point in the plant.
  • a system for processing plants comprising: a memory configured to store image data; an image sensor configured to acquire image data of a plant; a processor configured to: analyze the acquired image data with a machine vision component trained to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identify a target point in the plant based on the segmented image data; determine distance/depth from the image sensor to the target point in the plant; a controller configured to generate a control signal based on the distance/depth and communicate the control signal to a robot to position the robot at the target point in the plant.
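A minimal sketch of this sense-segment-localize-move loop, assuming duck-typed sensor/robot objects; every name here (image_sensor, segmenter, identify_target_point, pixel_to_pose, etc.) is an illustrative placeholder, not the patent's implementation:

```python
import numpy as np

def identify_target_point(stem_mask: np.ndarray) -> tuple[int, int]:
    """Naive placeholder heuristic: pick the highest stem pixel in the image."""
    ys, xs = np.nonzero(stem_mask)
    i = int(np.argmin(ys))          # smallest row index = highest point in frame
    return int(xs[i]), int(ys[i])

def process_plant(image_sensor, segmenter, controller, robot) -> None:
    image = image_sensor.acquire()                      # acquire image data
    stem_mask = segmenter.segment(image)                # segment plant anatomy
    target_px = identify_target_point(stem_mask)        # target point in image
    depth = image_sensor.depth_at(target_px)            # stereo distance/depth
    pose = controller.pixel_to_pose(target_px, depth)   # map to robot frame
    robot.move_to(pose)                                 # position robot at target
```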
  • Figure 1A shows a clipping device with a clamp in an open position.
  • Figure 1B shows a clipping device with a clamp in a closed position.
  • Figure 2 shows wire feeding components of the clipping device.
  • Figure 3A shows a first view of wire curving components of the clipping device.
  • Figure 3B shows a second view of the wire curving components.
  • Figure 3C shows examples of various configurations of curved wire clips.
  • Figure 4A shows wire cutting components of the clipping device.
  • Figure 4B shows interaction of wire cutting components so that a rotating cam pivots a wire cutting lever to execute a cutting motion of a cutter.
  • Figures 5A-5D show clamping components of the clipping device formed as first and second opposing claw-shaped arms that are pivoted toward each other by a linear actuator to transition from an open position (Fig. 5A) through intermediate positions (Figs. 5B and 5C) to a closed position (Fig. 5D).
  • Figure 6 shows connection of wire feeding components, wire curving components, wire cutting components, and clamping components in absence of actuators and supporting structures.
  • Figure 7 shows a mathematical model as it applies to the transverse grooves and the pulling of the wire to the feeder.
  • Figure 8 shows the relationship between the relative location of the Bender to the Wire Feeder and the diameter of the clip.
  • Figure 9 shows two different profiles of the Bender. Changing a linear translational position of surface (a) and changing an angular rotational position of surface (b).
  • Figure 10 shows a schematic that demonstrates a change in Cam profile causing a different shaping of the Clip.
  • Figure 11 shows variant wire curving components of the clipping device modified to attach a heat coil to the wire guide to heat a plastic wire passing through a bore formed in the wire guide.
  • Figure 12 shows a block diagram illustrating a first variant method for handling plants including machine vision algorithms and robot motion algorithms.
  • Figure 13 shows a block diagram illustrating a second variant method for handling plants including machine vision algorithms and robot motion algorithms.
  • Figure 14A shows a block diagram illustrating a third variant method for handling plants providing a more specific example of machine vision algorithms.
  • Figure 14B shows a block diagram illustrating a fourth variant method for handling plants providing a more specific example of machine vision algorithms - schematic of the real-time point localization using feature-based soft margin SVM-PCA method.
  • Figure 15 shows a block diagram illustrating a system map for handling plants including machine vision algorithms and robot motion algorithms.
  • Figure 16 shows the clipping device with a stereo camera installed on a robotic arm.
  • Figure 17 shows examples of clipping points identified by expert farmer selections.
  • Figure 18 shows color values of pixels plotted in four different color spaces; for plant recognition, the concentration of pixels with similar color values in LAB is better than in other color spaces.
  • Figure 19 shows schematic steps of the stem recognition using adaptive color image segmentation based on optimized LAB color space.
  • Figure 20 shows schematic steps of the wooden stake segmentation/recognition.
  • Figure 21 shows comparison of the histogram (left) and kernel density estimation (right) constructed using the same data; the dashed individual kernels make up the kernel density estimator.
  • Figure 22 shows schematic steps for computing the Principal Orientation of the Histogram Gradient.
  • Figure 23 shows schematic steps of a multi-stage point density method to identify the most suitable clipping point along the seedling’s main stem for different types of vegetables including peppers, tomatoes, and cucumbers.
  • Figure 24 shows comparison of plant recognition for three types of seedlings (pepper, tomato, and cucumber) after applying four comparator automatic adaptive segmentation methods and the presently disclosed adaptive segmentation based on feature descriptors (entropy and variance).
  • Figure 25 shows stem and stake recognition of pepper (1), cucumber (2), bell pepper (3), and tomato (4) seedlings after applying the adaptive color image segmentation based on feature descriptors (entropy and variance) and the hybridization of the Otsu method and median filter; different cameras were used to take images in different lighting conditions and backgrounds to check the robustness of the algorithm.
  • Figure 26 shows suggested clipping points after applying the multi-stage point density method; the stereo camera matches left and right images to find the distance of the clipping point from the clipping device and calculates the orientation of the clipping device related to the suggested clipping point.
  • Figure 27 shows suggested clipping points using multi-stage point density algorithm for samples 1 and 2; in sample 3, although a suitable clipping point could be identified on the seedling, the stake is behind a leaf and not accessible; sample 4 shows a case where neither the stem nor the stake was accessible.
  • Figure 28 shows the clipping device and stereo camera on a robotic arm; the stereo camera takes images from the seedling and stalk; after recognizing the suitable clipping point using the multi-stage point density method, the robot moves the clipping device near the recognized clipping point; and the clipping device makes a clip around the stem and stake.
  • Figure 29 shows an impedance controller block diagram of a servo motor in the clipping device.
  • Figure 30 depicts trajectory and torque profile of a servo motor in automated and hand-held clipping device platforms during a complete cycle of clip production.
  • Figure 31 shows 3D renderings depicting (a) a potential schematic design of a robotic arm, (b) a gantry system equipped with multiple robotic arms for the stem-stake coupling system, (c) a schematic of a robotic stem-stake coupling system with two gantries and nine robotic arms, and (d) a photograph of a working robotic system with a single gantry and two robotic arms.
  • Figure 32 shows a block diagram of a robotic system that can include multiple gantries (1 to m) supporting varying numbers of robotic arms (1 to n) equipped with an automatic clipping device (ACD), a robotic control unit (RCU), and a stereo camera.
  • This object-oriented design offers flexibility to adjust the number of gantries and arms as needed.
  • Figure 33 shows a graphical user interface (GUI) of the robotic stem-stake coupling system with fifteen distinct subsections.
  • Figure 34 shows evaluation results of five robotic arm configurations for the stem-stake coupling task based on eleven key parameters.
  • Figure 35 shows a 5-degree-of-freedom (5-DOF) robotic arm, custom-designed and fabricated to fulfill the requirements of an experimentally implemented robotic system; configuration and main components of the robotic arm are shown from the left (a) and right (b) perspectives.
  • Figure 36 shows a logic flowchart outline of the robotic arm shown in Fig. 35.
  • Figure 37 shows detailed analysis of the position, torque, and error for each joint in achieving the desired position of the robotic arm. This figure illustrates the discrepancies between the commanded and actual positions, along with the corresponding torque applied at each joint.
  • Figure 38 shows comprehensive examination of the angle, torque, and error for each joint in achieving the desired orientation of the ACD. This figure highlights the differences between the target and achieved joint angles, as well as the torque required for each joint to reach the specified orientation.
  • Figure 39 shows (a) a schematic representation of error calculations; (b) a box-and-whisker plot illustrating the robotic arm’s and machine vision’s accuracy in determining and reaching a specific point within the working space; and (c) shows accuracy and repeatability of the robotic arm in reaching a specific point in the working space - the black (darker colored) dots represent the accuracy and repeatability of the robotic arm alone and the blue (lighter colored) dots indicate the accuracy and repeatability of the integrated system using both machine vision and robotic arms.
  • Fig. 1 shows a perspective view of the clipping device (CD) 10 and its various parts.
  • the CD 10 makes a clip and places it around a wooden stake and the main stem of a seedling or a flower.
  • the clip provides additional support to the plant and avoids damage during transportation.
  • the working principle of the CD 10 is based on feeding a wire 21 or equivalent thin filament from a rotating wire feeder 12 through a rotating wheel (main feeder) 14 driven by an actuator.
  • the wire 21 is pulled from the wire feeder 12 by the actuated rotation of main feeder 14 and is pressed against a feeder supporter 16.
  • the main feeder 14 When the main feeder 14 rotates, it pulls the wire forward against the feeder supporter 16 and pushes the wire through an opening inside the wire guide 18 which defines a bore or channel shape with an input opening 19a proximal to the feeder supporter 16 surface that abuts wire 21 and an opposing output opening 19b proximal to a bender 20.
  • the main feeder 14 pushes the wire through and out of the wire guide 18, more specifically out of output opening 19b, to strike the bender 20 (which is optionally configured as a tunable/ adjustable mechanism that is adjusted to change the diameter or shape of a wire clip) and the force of pushing the wire against a strike surface 81 of the bender 20 shapes the curling wire into a ring shape clip 22.
  • a cutter 24 cuts the wire at the end of each cycle when a full clip is formed.
  • a first actuator such as an electric motor (Servo Motor 1) 26 or other types of actuators such as a pneumatic actuator, turns the rotating wheel of the main feeder 14 and as the wire 21 passes through the wire guide 18 the force applied by the bender 20 shapes the wire into the clip 22. The same actuator also drives the cutter 24 to cut the wire at the end of one cycle to release the clip 22 from the feeding wire.
  • the components of wire feeder 12, main feeder 14, feeder supporter 16, wire guide 18, and first actuator 26 are interconnected in a desired orientation to one another by connection to a frame 28.
  • the wire By locating the clipping device near a suitable point near the stem of the plant, the wire can wrap around the stem and the wooden stake as the wire is shaped into a clip 22.
  • the size and shape of the clips can be changed with respect to the seedling or plant.
  • the wire can be fed continuously, with the bender 20 shaping the clip 22 and the cutter 24 releasing the clip 22, and therefore it is not necessary to use pre-made clips in a cartridge.
  • the bender 20 is tunable/adjustable, and a tuning screw 80 on the bender 20 allows tuning of the bender 20 to change an impact or strike point of the feeding wire 21 against the strike surface 81 of the bender 20 and therefore change the force applied to the wire - changing the applied force allows for changing the shape and diameter of the clip 22 as well as the number of overlaps.
  • the CD also includes a clamp comprising first and second opposing claw-shaped arms (30, 32) that have a specialized shape.
  • the claw-shaped arms are useful to bring the stem and wooden stake close to each other prior to the clipping.
  • a second actuator such as an electric motor (Servo Motor 2) 34 or other types of actuators such as a pneumatic actuator is used to close (and open) the claw-shaped arms.
  • Fig. 1A shows the claw-shaped arms in an open position, and Fig. 1B shows a closed position.
  • the specific shapes of the claw-shaped arms bring the stem and wooden stake closer as the arms are closed by the second actuator.
  • the same actuator also brings the head or the clip forming portion of the clipping device (while closing the claw-shaped arms) to the appropriate position near the stem and wooden stake, i.e., the second actuator translates frame 28 and its connected components relatively closer to the claw-shaped arms concurrently with closing, while translating frame 28 and its connected components relatively farther away from the claw-shaped arms during opening.
  • the claw shaped arms (30, 32) can be considered more generally as an example of a clamp, and the claw shaped arms (30, 32) are first and second opposing jaws (30, 32) of the clamp.
  • the first and second opposing jaws are rotationally mounted to a clamp frame 40 of the device at first and second clamp rotation points (42, 44), respectively, the clamp aligned with the exit of the wire guide 18.
  • the second actuator 34 is pivotably coupled to both of the first and second opposing jaws (30, 32) at a common third clamp rotation point 46, the common third clamp rotation point located approximately equidistant from the first and second clamp rotation points, the second actuator 34 driving counter-rotation of first and second opposing jaws (30, 32) to circumferentially reduce an open space between the first and second opposing jaws (30, 32). More specifically, counter-rotation means that the first and second opposing jaws rotate in opposing directions such that one of the jaws rotates clockwise while the other jaw rotates counter clockwise.
  • the second actuator 34 is coupled to the common third clamp rotation point 46 by a rack-and-pinion transmission.
  • the second actuator 34 is directly connected to a pinion gear 48, and pinion gear 48 engages rack 50 which is attached in a fixed position in clamp frame 40.
  • Rotational motion of pinion gear 48 along rack 50 translates a linear guide holder 52 relative to rack 50, and consequently also translates linear guide holder 52 relative to clamp frame 40 and also translates the common third clamp rotation point 46 relative to clamp frame 40.
  • a linear arm holder 54 extending from the linear guide holder 52 has a proximal end connected to the linear guide holder 52 and a distal end pivotally connected to the common third clamp rotation point 46. Linear sliding of the linear guide holder 52 is aided by bushings 56 slidably engaging linear tracks 58 that are orientated parallel to rack 50.
  • first and second slots (60, 62) formed within the opposing first and second jaws, respectively.
  • Each slot extends from a first end proximal to a capturing surface of the jaw (i.e., the claw surface that captures the stem) to a second end distal from the capturing surface and proximal to the linear guide holder 52 and its linear arm holder 54.
  • the first and second slots cross over each other and the crossing point of the first and second slots provides for the coupling of the common third clamp rotation point.
  • each of the first and second slots are rotationally coupled to the clamp frame 40 at the first and second clamp rotation points (42, 44), while the crossing point of the slots is coupled to the distal end of the linear arm holder 54 forming the common third clamp rotation point 46.
  • linear guide holder 52 is attached to frame 28, and therefore as linear guide holder 52 translates relative to clamp frame 40, frame 28 and its attached components also move linearly relative to clamp frame 40 (for example, comparing Fig. 1A to Fig. 1B shows that as linear guide holder 52 moves linearly relative to clamp frame 40 and claw-shaped arms (30, 32), frame 28 moves with linear guide holder 52).
  • the materials used to feed the CD can be different depending on the need of the user.
  • the feeding wire can be made of metals such as steel, copper, etc., or plastic material such as polyethylene or polyamide. If the plastic material is used, a heater is typically used for pre-heating the plastic material prior to being fed through the wire guide 18.
  • the energy required to move the mechanisms of the CD can be from AC or DC sources as well as pneumatic or hydraulic actuators.
  • the CD can be used by farmers as a handheld device, or it can be installed on automatic machines or robots. If used as a hand-held device, a single button on the CD allows a farmer to use the CD manually to perform clipping.
  • the manual CD could be used without claw-shaped arms (30, 32).
  • a control board of the clipping device can connect to the automatic machine or robot to follow their commands.
  • the control board can support different kinds of communication protocols such as CAN, I2C, RS232, and RS485 to connect a variety of devices both wired and wireless.
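For illustration only, a hedged sketch of issuing a command to such a control board over RS232 using pyserial; the single-byte command set below is hypothetical, since the patent does not specify a wire protocol:

```python
import serial  # pyserial

# Hypothetical single-byte command set -- the patent does not define one.
CMD_MAKE_CLIP = b"\x01"
CMD_OPEN_CLAWS = b"\x02"
CMD_CLOSE_CLAWS = b"\x03"

def send_command(port: str, command: bytes) -> bytes:
    """Send one command to the control board over RS232 and read its reply."""
    with serial.Serial(port, baudrate=115200, timeout=1.0) as link:
        link.write(command)
        return link.readline()      # e.g., an acknowledgement line

# Example usage (port name is an assumption):
# send_command("/dev/ttyUSB0", CMD_MAKE_CLIP)
```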
  • Fig. 2 shows wire feeding components of the CD.
  • the main feeder 14 is connected to and driven by servo motor 1.
  • the main feeder 14 is shaped as a wheel with a central groove 74 defined on the perimeter or circumference of the wheel.
  • the central groove 74 provides a wire path in the circumferential center of the main feeder 14.
  • transverse grooves 76 i.e., transverse to the central groove
  • These transverse grooves 76 make small indentations on the wire 21 to push it forward by the force created because of these indentations.
  • the number of transverse grooves and the diameter of the main feeder determine the length of the clip 22. A greater number of transverse grooves results in more wire being pulled, and thereby a longer clip length.
  • Fig. 3 shows wire curving components of the CD 10.
  • a hole/bore (bore 19 communicatively extending from input opening 19a to output opening 19b) inside the wire guide 18 along the direction the wire is being pulled.
  • This hole/bore facilitates the feed wire 21 following a direct linear path and reduces and preferably prevents any bending of the wire before it reaches the bender 20.
  • the bender 20 applies a force to the wire 21 to bend it. The amount of the applied force depends on the pulling force of the main feeder. The surface and the angle of the profile of the bender 20 affect the direction of the applied force to the wire.
  • the tuning screw 80 on the bender 20 tunes the direction of the applied force to the wire 21 by adjusting the relative position of the bender (more specifically, a bender strike surface 81 that provides an impact point with wire 21) with respect to the wire guide 18.
  • the amount of applied force and its direction determine the shape and diameter (i.e., the curvature of the clip) as well as the number of wire overlaps in the clip 22.
  • Different surface profiles of the bender can be used to make different clip shapes.
  • the relationship between the relative position of the bender to the wire guide and the diameter of the clip is given in Fig. 8.
  • the number of transverse grooves 76 and diameter of the main feeder 14 determine how fast the wire is pulled, thereby determining the length of the wire in each clip.
  • the relationship between the number of transverse grooves and diameter of the main feeder and the length of a clip is given in Fig. 7.
  • Fig. 4A shows wire cutter components of the CD.
  • Servo motor 1 rotates the main feeder 14 and cutter guide with cam 84 (referenced for brevity as cam 84) - the wheel of the main feeder 14 and the cam 84 are fixed together so as to move co-rotationally.
  • the cam 84 abuts and engages a cutter lever with cam 86 at a first end and cutter 24 mounted on a second end (referenced for brevity as lever cam 86) throughout its rotation.
  • the lever cam 86 is pivotally coupled to the wire guider 18, with a first end of the lever cam engaging the cam 84 and a second end of the lever cam forming the cutter 24 with the bender 20 mounted on top of the cutter 24.
  • Wire 21 is pulled by capture within the circumferential central groove 74 and associated transverse grooves 76 during rotation of the main feeder 14.
  • a resting gap 88 is formed as a flattened circumferential portion on the main feeder.
  • the main feeder rotates about 320 degrees. After that, due to the resting gap on the main feeder, the wire is not pulled by the main feeder.
  • the cam 84 is aligned with the resting gap 88. Therefore, synchronized with a cessation of pulling force due to the resting gap 88, the cam 84 pushes a lever cam 86 downward, causing the lever cam 86 to rotate, which causes the cutter 24 to cut the wire 21 and separate the clip 22 from the wire 21.
  • FIG. 4B shows a resting position and a cutting position of the lever cam 86 and its associated cutter 24 throughout rotation of cam 84.
  • cam 84 For a majority of rotation of cam 84 (for example, approximately 320 degrees) the lever cam 86 and cutter 24 are biased towards a resting position, and when the resting gap 88 faces the wire 21 and the feeder supporter 16, the cam 84 engages the lever cam 86 to pivot the lever cam 86 and cutter 24 into a cutting position.
  • the cam 84 and resting gap 88 are aligned, and the cam 84 and lever cam 86 are also shaped and configured to engage and then disengage within the resting gap 88 portion of the rotation of the main feeder 14 so that the lever cam 86 and its associated cutter 24 pivot to a cutting position and clear the cutting position to return to a resting position aligned with and synchronous with the rotational portion of the resting gap 88 temporarily ceasing pulling of wire 21.
  • Figs. 5A-5D show the Collector Mechanism/clamping components of the CD. From the resting/open position (Fig. 5A), servo motor 2 pushes the linear arm holder 54 forward and closes the claw-shaped arms. When the linear arm holder 54 comes forward, the claw-shaped arms close and bring together the main stem and wooden stake in the center of the claw-shaped arms. The linear arm holder 54 continues to move forward and closes the claw-shaped arms completely (Figs. 5B and 5C). The arms hold the stem and wooden stake tightly and collect them in the center with minimal damage to the stem due to the special shape and profile of the claws (Fig. 5D).
  • first and second slots (60, 62) are shaped biphasic with first and second portions and symmetrically mirrored (first portions distal from the capturing surface of the claw-shaped arms and second portions proximal to the capturing surface of the claw-shaped arms) so that when the common third clamp rotation point 46 moves forward (in a direction distal to proximal to the capturing surface of the claw-shaped arms) along the corresponding first portion of the slots (60, 62) the claw-shaped arms converge or close by rotating towards each other, or conversely when the common third clamp rotation point 46 moves backward (in a direction proximal to distal from the capturing surface of the claw-shaped arms) along this same first portion of the slots (60, 62) the claw-shaped arms expand or open by rotating away from each other (Fig. 5).
  • Fig. 6 shows the various components shown in Figs. 1-5 assembled without actuators and support structures for convenience of illustration of interaction of these components.
  • Fig. 7 shows a mathematical model as it applies to the transverse grooves and the pulling of the wire to the feeder, modifying the length of the clip according to an equation in which: L is the length of the clip; n is the number of transverse grooves; α is the angle of the resting gap; and k_1 is a material-dependent coefficient, a weighting factor included in the mathematical model to allow tuning of the output value. (A hedged stand-in for this equation appears in the sketch below.)
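Since Equation 1 itself appears only in Fig. 7, the following is a hedged stand-in consistent with the stated dependencies (feeder geometry, resting-gap angle α, groove count n, and coefficient k_1); it is not the patent's actual formula:

```python
import math

def clip_length(feeder_diameter: float, resting_gap_rad: float,
                n_grooves: int, k1: float) -> float:
    """Assumed stand-in for Equation 1: wire advanced per cycle equals the
    feeder arc that engages the wire, scaled by a material coefficient k1 and
    a saturating grip factor that grows with the number of transverse grooves."""
    engaged_arc = (feeder_diameter / 2.0) * (2.0 * math.pi - resting_gap_rad)
    grip = n_grooves / (n_grooves + 1.0)   # assumed: more grooves, less slip
    return k1 * grip * engaged_arc

# Example: 40 mm feeder, 0.7 rad resting gap (~40 deg), 12 grooves, k1 = 1.0
# clip_length(0.040, 0.7, 12, 1.0)  ->  ~0.103 m of wire per clip
```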
  • Fig. 8 shows the relationship between the relative location of the bender 20 and its strike surface 81 to the feed wire 21 and the diameter of the clip.
  • Fig. 9 shows two different profiles of the bender. Changing the location of surface (a) and changing the angle of surface (b). Changing the location of the surface and the angle of the bender changes the direction and amount of force on the wire and the shape of the clip.
  • Fig. 10 shows how changing the cam profile causes the different shapes of the clip.
  • the varying diameter of the cam causes rotation of the lever cam, which in turn results in changing the position of the bender.
  • Changing the position of the bender will change the direction of the applied force on the wire.
  • the clip has a helical shape.
  • Fig. 11 shows variant wire curving components of the clipping device modified to attach a heat coil to the wire guide to heat a plastic wire passing through a bore formed in the wire guide.
  • a heater element such as a heating coil, heating bar and the like can be used to heat wire passing through the bore of the wire guide to make the wire more bendable, moldable, malleable, flexible, and the like for ease of reshaping wire that exits from the bore and strikes a bender surface.
  • the variant wire curving components shown in Fig. 11 provides a clipping device equipped with a plastic wire roll with a heating coil provided to heat the plastic wire.
  • the clipping device may be equipped with various materials for making a clip, for example copper, stainless steel, polyethylene, polyamide, polyesters, or different kinds of plastic wires.
  • a heating coil can be coupled to the wire guide to heat the plastic material as it passes through the bore of the wire guide to make it more bendable, moldable, malleable, flexible, etc., so as to ease reshaping of the plastic wire exiting the bore by the wire curving components to form a plastic clip.
  • Fig. 12 shows a block diagram illustrating a first variant method for handling plants including machine vision algorithms and robot motion algorithms.
  • Fig. 13 shows a block diagram illustrating a second variant method for handling plants including machine vision algorithms and robot motion algorithms.
  • Fig. 14A shows a block diagram illustrating a third variant method for handling plants providing a more specific example of machine vision algorithms.
  • Fig. 14B shows a block diagram illustrating a fourth variant method for handling plants providing a more specific example of machine vision algorithms - schematic of the real-time point localization using feature-based soft margin SVM-PCA method.
  • Fig. 15 shows a block diagram illustrating a system map for handling plants including machine vision algorithms and robot motion algorithms.
  • Experimental testing results demonstrate the ability of the currently disclosed device, system and method to benefit plant processing in autonomous, semi-autonomous and manual modes.
  • the following experimental examples are for illustration purposes only and are not intended to be a limiting description.
  • Clipping is the task of putting a rubber band or plastic clip around the seedling’s main stem and a wooden stake at a particular point along the stem to provide additional support to the seedling and avoid damage during transportation.
  • the clipping task involves a human worker bending or kneeling on the floor while using two hands to stretch a rubber band or to put a plastic clip around the plant and the stake. This is a physically demanding and painstaking task, and most modern greenhouses still require a significant amount of manual labour to process a large number of seedlings in a propagation arena. As an example, in one propagation facility (e.g., Roelands Plant Farms, Lambton Shores, ON, Canada), more than 25 million seedlings grow and are clipped per year.
  • the robotic clipping solution contains two main parts; a mechatronic unit that performs the act of clipping and a vision unit that identifies the clipping points.
  • the focus of this paper is on the vision unit, which replicates human visual functionalities and perception to identify a suitable clipping point along the seedling’s main stem for different types of vegetables including peppers, tomatoes, and cucumbers.
  • Machine vision has been widely used to support precision agriculture by providing automated solutions for tasks that are traditionally performed manually.
  • Examples of such methods are optimized image registration and deep learning segmentation {11/kerkech2020vine}, adaptive multi-vision technology {12/chen2020three}, and image fusion technology {13/li2021recent}.
  • methods such as transfer learning {14/bai2022multi} and point-cloud deep learning with convolutional neural networks (CNNs) {15/jayakumari2021object} need hundreds of labeled images to train the network for each type of seedling and have long processing times {16/kolar2018transfer}.
  • the analytical part of the multi-stage point density method is based on the point density variation, kernel density estimation, the principal orientation of the histogram gradient, and normalized cross-correlation for matching.
  • the point density variation calculates the disparity of intensity of colors on a map.
  • the kernel density estimator estimates the population of a finite data sample by smoothing the underlying data. Using the principal orientation of the histogram gradient and normalized cross-correlation for matching, the multi-stage point density method suggests a suitable clipping point.
  • the algorithm checks the accessibility of the wooden stake and stem for the robotic arm and clipping device and maps the stereo camera coordinate system to the robot coordinate system to provide necessary sensory feedback for the controller of the robot.
  • a mechatronic unit that includes the clipping device on a general purpose robotic arm (i.e., KUKA LWR IV).
  • Our novel clipping device curls a thin wire to simultaneously make and attach the clip to the plant.
  • An optimized stereo camera (Megapixel/USB5000W02M) has been placed on the clipping device to take images from the plants and send them to the vision algorithm.
  • Fig. 16 shows the installed clipping device and the stereo camera used for evaluation purposes.
  • An automated clipping system may include a plurality of specialized robotic arms equipped with such devices.
  • Suitable Clipping Point: Finding a suitable clipping point on the seedlings is the most imperative, challenging, and time-consuming task of the machine vision of the robotic clipping system.
  • the clipping point can be on the highest point, higher than the uppermost node on the main stem. If the length of the main stem between two nodes is short, the clipping point is selected below the highest node or axil. However, the leaves are dense around the highest node, and some parts of the main stem are behind the leaves. Thus, recognizing the main stem and petiole is difficult. The different shapes and types of seedlings make the recognition process even harder.
  • the selection of the clipping point is a cognitive process that relies on heuristic information.
  • Fig. 17 shows some seedlings and the preferred clipping points that expert farmers validated.
  • an artificial neural network (ANN) is used to predict the optimized cut-off values for the locally adaptive threshold for each pixel based on the entropy and variance around the pixel.
  • the input features of the ANN are the variance and entropy of the sub-image around the pixel, and the output is the sub-range of the L, A, and B channels' cut-off values for the multilevel threshold.
  • the variance is a measure of variability and provides an indication of how the pixel values are spread.
  • for each sub-image around the pixel, the normalized histogram is $H(i) = n_i / N$, where $n_i$ is the number of pixels with a gray level of $i$, $N$ represents the total number of pixels in the sub-image, and $L$ is the maximum grey level. The mean value $\mu$ can be obtained as $\mu = \sum_{i=0}^{L-1} i \, H(i)$, and the variance as $\sigma^2 = \sum_{i=0}^{L-1} (i - \mu)^2 \, H(i)$.
  • the entropy measures the average uncertainty of the information source, defined as the corresponding states of the intensity level to which individual pixels can adapt. The higher the value of the entropy, the more detailed the image is {17/deng2009entropy}.
  • the entropy is defined as $E = -\sum_{i=0}^{L-1} H(i) \log_2 H(i)$, where all parameters are as defined before.
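A minimal sketch computing these two features from a sub-image's normalized histogram, assuming 8-bit grayscale input; the function name is illustrative:

```python
import numpy as np

def variance_entropy(sub_image: np.ndarray, levels: int = 256) -> tuple[float, float]:
    """Variance and entropy of an 8-bit grayscale sub-image via its histogram."""
    counts = np.bincount(sub_image.ravel(), minlength=levels).astype(float)
    h = counts / sub_image.size             # normalized histogram H(i) = n_i / N
    i = np.arange(levels)
    mu = float(np.sum(i * h))               # mean gray level
    var = float(np.sum((i - mu) ** 2 * h))  # variance
    nz = h[h > 0]
    ent = float(-np.sum(nz * np.log2(nz)))  # Shannon entropy
    return var, ent
```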
  • Fig. 19 shows the block diagram of the stem recognition algorithm. After camera calibration using Zhang's method {20/zhang2000flexible}, the quality of the images was enhanced and restored using pre-processing methods combining equalization techniques, high-boost filters, and morphological boundary enhancement {21/thapar2012study}.
  • morphological filtering techniques were used to remove noise from the segmented stem {22/ruchay2017impulsive}. Using the hit-and-miss, thinning, and convex-hull techniques, the leaves were then eliminated, as sketched below.
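A hedged sketch of this cleanup stage using OpenCV; the file name and kernel size are assumptions, and thinning requires the opencv-contrib-python package:

```python
import cv2

# Illustrative cleanup of a binary stem mask (not the patent's exact pipeline).
mask = cv2.imread("stem_mask.png", cv2.IMREAD_GRAYSCALE)   # assumed input
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
clean = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)     # remove small noise
skeleton = cv2.ximgproc.thinning(clean)                    # stem centerline
```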
  • Wooden Stake Recognition: Recognizing the wooden stake inserted beside a seedling is more straightforward.
  • the hybridization of the Otsu method and median filter {23/pacifico2018hybrid} can be used for stake recognition.
  • Fig. 20 shows the schematic steps of the stake segmentation method. The wooden stake is almost vertically straight. Thus, hidden and covered parts of the stake can be found using simple partial spline matching.
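A hedged sketch of the median-filter/Otsu hybrid, followed by a simple vertical fit standing in for partial spline matching; the input file name and polynomial degree are assumptions:

```python
import cv2
import numpy as np

gray = cv2.imread("seedling.png", cv2.IMREAD_GRAYSCALE)        # assumed input
smoothed = cv2.medianBlur(gray, 5)                             # median filter
_, stake = cv2.threshold(smoothed, 0, 255,
                         cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # Otsu threshold

# The stake is almost vertical, so occluded segments can be interpolated by
# fitting x as a low-order function of y (a stand-in for spline matching).
ys, xs = np.nonzero(stake)
x_of_y = np.poly1d(np.polyfit(ys, xs, deg=2))
```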
  • the multi-stage point density method uses the boundary and skeleton of the seedling to find the region of interest and limit the search area.
  • the borders of the region of interest are computed using Equations (3), (4), and (5) below,
  • $\bar{S}_i$ and $\bar{S}_j$ are the mean values of the skeleton in the x and y directions for all non-null pixels; $P_r$, $P_l$, and $P_t$ are the right, left, and top values of the seedling boundaries for non-null values; $S(i)$ and $S(j)$ are the values of the skeleton in pixel $(i,j)$; and $\phi_r$, $\phi_l$, and $\phi_t$ are the number of null values for the right, left, and top of the boundary of the plant, respectively.
  • Point Density Variation shows the disparity of intensity of colors on a map {24/lawin2018density}.
  • a Gaussian mixture model represents a distribution for each color channel $i$ as $P_i(x) = \sum_{k=1}^{K} \pi_{i_k} \, \mathcal{N}\!\big(x \mid \mu_{i_k}, \Sigma_{i_k}\big)$, where the $\pi_{i_k}$ are the mixing coefficients that meet the condition $\sum_{k=1}^{K} \pi_{i_k} = 1$, with $0 \le \pi_{i_k} \le 1$.
  • the density $P_i(x)$ is the Gaussian distribution of intensities in each color channel, and $\mathcal{N}\!\big(x \mid \mu_{i_k}, \Sigma_{i_k}\big)$ is the Gaussian density with the mean value $\mu_{i_k}$ and the variance $\Sigma_{i_k}$.
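A minimal per-channel mixture fit using scikit-learn, assuming $K = 3$ components; the patent does not specify a fitting library, so this is only an illustrative stand-in:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_channel_gmm(channel: np.ndarray, k: int = 3) -> GaussianMixture:
    """Fit a K-component Gaussian mixture to one color channel's intensities."""
    x = channel.reshape(-1, 1).astype(float)   # pixel intensities as samples
    gmm = GaussianMixture(n_components=k, covariance_type="full").fit(x)
    # gmm.weights_ ~ pi_k, gmm.means_ ~ mu_k, gmm.covariances_ ~ Sigma_k
    return gmm
```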
  • Kernel Density Estimator: In most computer vision and pattern recognition applications, the feature space is complex, noisy, and rarely describable by common parametric models, so non-parametric density estimation techniques have been widely used to analyze arbitrarily structured feature spaces {26/yang2003improved}.
  • the kernel density estimator, as a non-parametric density estimation technique, calculates the density of features in a neighborhood around those features; the density function is estimated by a sum of kernel functions (typically Gaussians) centered at the data points {27/elgammal2002background}.
  • a bandwidth associated with the kernel function is chosen to control the smoothness of the estimated densities; more data points allow a narrower bandwidth and a better density estimate. The kernel density estimator spreads the known quantity of the population for each point out from the point location of random non-parametric variables {28/matioli2018new}.
  • point density variation estimates the density of stem intensity
  • kernel density estimation is a fundamental data smoothing technique, as shown in Fig. 21, that makes inferences about the population from a finite data sample {29/scaldelai2022multiclusterkde}.
  • the kernel density estimator is defined as $\hat{f}_h(x) = \frac{1}{nh} \sum_{i=1}^{n} K\!\left(\frac{x - x_i}{h}\right) = \frac{1}{n} \sum_{i=1}^{n} K_h(x - x_i)$, where $K$ is the kernel, $h > 0$ is a smoothing parameter called the bandwidth, and $K_h(x) = \frac{1}{h} K\!\left(\frac{x}{h}\right)$ is the scaled kernel.
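A minimal sketch evaluating this estimator with a Gaussian kernel; function and variable names are illustrative:

```python
import numpy as np

def gaussian_kde_1d(samples: np.ndarray, grid: np.ndarray, h: float) -> np.ndarray:
    """Evaluate f_hat(x) = (1/(n*h)) * sum_i K((x - x_i)/h) with a Gaussian K."""
    u = (grid[:, None] - samples[None, :]) / h
    k = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)   # Gaussian kernel
    return k.sum(axis=1) / (samples.size * h)
```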
  • the Principal Orientation of the Histogram Gradient represents a signature of the features of spatial regions {31/lauria2018nonparametric}, {32/wiangsamut2022fast}.
  • the region of interest was divided into sub-images (voxels) with a size of about 1.5 times the stem's average thickness. The boundary of the stem was used to calculate the stem's average thickness {33/wang2020fruit}. This value can also be assigned by the user.
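A hedged sketch of computing a voxel's principal gradient orientation via a magnitude-weighted orientation histogram (HOG-style); the bin count is an assumption:

```python
import numpy as np

def principal_gradient_orientation(voxel: np.ndarray, bins: int = 36) -> float:
    """Dominant gradient orientation (degrees) of a sub-image, HOG-style."""
    gy, gx = np.gradient(voxel.astype(float))
    mag = np.hypot(gx, gy)                            # gradient magnitude
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0      # unsigned orientation
    hist, edges = np.histogram(ang, bins=bins, range=(0.0, 180.0), weights=mag)
    peak = int(np.argmax(hist))
    return 0.5 * (edges[peak] + edges[peak + 1])      # bin-center orientation
```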
  • the normalized correlation metric was used to match the histograms, i.e., $\mathrm{NCC}(H_1, H_2) = \dfrac{\sum_i \big(H_1(i) - \bar{H}_1\big)\big(H_2(i) - \bar{H}_2\big)}{\sqrt{\sum_i \big(H_1(i) - \bar{H}_1\big)^2 \, \sum_i \big(H_2(i) - \bar{H}_2\big)^2}}$, where $H_1$ and $H_2$ are the histograms of the candidate voxel and ground voxels with the same normalized principal orientations of the histogram gradient.
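A minimal implementation of this metric; names are illustrative:

```python
import numpy as np

def ncc(h1: np.ndarray, h2: np.ndarray) -> float:
    """Normalized cross-correlation between two equal-length histograms."""
    a = h1 - h1.mean()
    b = h2 - h2.mean()
    return float(np.sum(a * b) / np.sqrt(np.sum(a * a) * np.sum(b * b)))
```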
  • Fig. 23A summarizes the step-by-step of the proposed algorithm for bell pepper seedlings.
  • the image is transferred to the LAB color space, and the adaptive color image segmentation and hybridization of the Otsu method and median filter are applied to recognize the stem and stake.
  • the boundary and skeleton of the plant are obtained next using morphological image processing operations.
  • the region of interest is defined using the boundary and skeleton.
  • a combination of recursive dilation and erosion and morphological techniques such as Hit and Miss, convex hull, and thinning are used to eliminate the leaves from the plant to find the stem.
  • the third row contains the results of applying point density variation (Fig. 23B shows a magnification of the point density variation plot) and kernel density estimator for finding the most suitable clipping point on the stem.
  • the stake is checked to determine whether there is a corresponding point on the stake. If accessible, the multi-stage point density method suggests the point and calculates the distance and depth using the images from the stereo camera {34/dandil2019computer}. Finally, the multi-stage point density algorithm calculates the most suitable orientation and position of the clipping device in the real coordinates of the robotic arm.
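The patent does not print the stereo formula; the standard pinhole-stereo relation $Z = fB/d$ is the usual way such distance/depth is computed from matched left/right points, sketched here with assumed example values:

```python
def stereo_depth(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole-stereo depth: Z = f * B / d for a matched left/right point."""
    return f_px * baseline_m / disparity_px

# e.g., f = 1400 px, B = 0.06 m, d = 210 px  ->  Z = 0.4 m to the clipping point
```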
  • after adaptive segmentation, we applied the multi-stage point density method to three types of seedlings to find the correct position of the clipping points and evaluate the quality of the results.
  • the first category contained 120 images of bell pepper seedlings
  • the second category had 80 images of tomatoes
  • the third category contained 80 images of cucumbers.
  • Table 2 shows the success rate of finding the suitable clipping point for each type of seedling. The leaves of cucumbers are big, and access to the stem at the top of the plant is difficult. Thus, the success rate for cucumbers is lower than for the other seedlings.
  • Fig. 26 shows the recognized clipping points on the seedlings using the multi-stage point density method.
  • Fig. 28 shows examples of seedlings with clips on them.
  • Experimental Example 1 proposes a new approach for finding the most suitable clipping point for a new robotic clipping system under development.
  • the proposed approach is conceptually different from other feature detection methods in that it combines analytical image processing methods and data-driven learning algorithms. This allows us to solve the challenging problem of clipping point detection.
  • the success of our adaptive segmentation approach was in part due to the use of the variance and entropy of voxels as two effective features for tuning the local cut-off values.
  • the final results of the algorithm i.e., the identified clipping points, were verified by expert farmers to validate the efficacy of the algorithm. As a whole, the obtained results indicated satisfactory performance in finding the most suitable clipping point.
  • ACD automatic stem-stake coupling device
  • HCD handheld stem-stake coupling device
  • ACD and HCD utilize interconnected mechanisms to create clips of various sizes and shapes from metallic wire. These mechanisms include Pushing Mechanism, Curving Mechanism, Cutter Mechanism, and, for the ACD specifically, Collector Mechanism. Both devices operate on the principle of feeding a thin wire from the Feeding Wire Spool.
  • the wire is pulled by the Main Feeder and is pressed against the Feeder Supporter. As the wire is pulled it moves through the Wire Guider while a Bender shapes it into a ring-shaped clip. At the end of this cycle, the Cutter cuts the wire when a full clip is formed.
  • a first actuator such as an electric motor (Servo Motor 1) turns the Main Feeder and drives the Cutter as well.
  • the clipping mechanism incorporates an Optical Sensor that sends a pulse to indicate the completion of each cycle. By positioning the clipping device near the stem of the plant, the wire wraps around the stem and wooden stake. The size and shape of the clips can be adjusted based on the seedling or plant using a Tuning Screw to adjust the Bender.
  • the ACD is equipped with a stereo camera and claw-shaped arms.
  • a second actuator such as an electric motor (Servo Motor 2), rotates the claw-shaped arms, bringing the head of the clipping device closer to the stem and wooden stake and closing the arms.
  • the ACD is integrated into a robotic system.
  • a vision system utilizes stereo images to provide real-time information about the optimal orientation and position of the stem-stake coupling point, as well as the 3D spatial coordinates of the ACD, which are then transmitted to the robotic arm.
  • the impedance control method is employed to regulate the speed and torque of the servo motors based on the desired shape and size of the clips.
  • Various materials can be used to produce the clips, including metals like steel and copper wire, as well as plastic materials such as polyethylene or polyamide wire.
  • copper wire is often selected based on growers’ preferences.
  • a heater is required to preheat the material before feeding it through the Wire Guider.
  • the pushing mechanism’s role is to exert force and propel the wire forward at a specified velocity.
  • the Main Feeder is connected to the Main Servo Motor (Servo Motor 1).
  • the Main Feeder has a Central Groove around its perimeter, which ensures the wire stays centered as it moves. Additionally, there are Transverse Grooves perpendicular to the Central Groove on the Main Feeder’s perimeter. These Transverse Grooves create small indentations on the wire, propelling it forward as the Main Feeder rotates.
  • the length of the clip depends on the diameter of the Main Feeder, the number of Transverse Grooves, and the arc length of the Resting Gap.
  • the empirical model stating the relationship between the mentioned parameters and the length of the clip may be expressed as Equation 1, discussed above with reference to Fig. 7.
  • Curving Mechanism bends the wire into the desired shape. As the wire moves forward, it passes through a Wire Guider and then encounters a Bender. There is a hole/bore/channel inside the Wire Guider along the direction the wire is being pushed. This hole ensures that the wire follows a straight path and prevents any bending before it reaches the Bender. The Bender applies a normal force to the Wire to bend it. The normal force applied on the wire is proportional to the pulling force of the Main feeder.
  • This force determines the shape and diameter (i.e., the curvature) of the clip, as well as the number of wire overlaps.
  • the surface and the profile angle of the Bender affect the direction of the applied normal force to the wire.
  • the Tuning Screw installed on the Bender allows tuning the relative position of the Bender with respect to the Wire Guider to adjust the direction of the applied force.
  • the surface profile of the Bender produces different clip shapes.
  • Fig. 9 shows two examples of different mechanisms with different profiles that result in producing clips with different diameters.
  • a first illustrative mechanism involves the Rotational Mechanism with a flat surface of the Bender.
  • Adjusting the Bender’s angle changes the orientation of its flat surface relative to the wire, thereby altering the force exerted on the wire and resulting in a different radius of the clip. For instance, the clockwise rotation of the Bender increases the force on the wire, resulting in smaller clip radii, and vice versa.
  • Fig. 9A shows the Positional Mechanism in which the angle of the Bender does not change, instead the position of the Bender changes relative to the Wire Guider. In this mechanism, the surface of the Bender is curved. Thus, a vertical movement of the Bender alters the force applied to the wire, resulting in a change in the clip’s radius.
  • the relationship between the clip’s radius and the Bender’s position is illustrated in Fig. 8. Additionally, Fig. 10 demonstrates how adjusting the Cam profile produces various clip shapes.
  • the Cam has different radii at different points, so when the Cam pushes the Lever Cam, the rotational angle of the Lever Cam varies. The rotation of the Lever Cam changes the position of the Bender, thereby altering the amount of normal force applied to the Wire. As a result, the clip shape will be spiral.
  • the benefit of the spiral clip lies in its ability to securely grasp the stem and stake, even when they are not closely positioned, particularly when the clip’s initial radius is relatively larger. In the final step of making the clip, the smaller radius of the clip tightens the stem and stake together.
  • Both the ACD and HCD utilize a single servo motor to push, curve, and cut the wire through an integrated mechanism.
  • the servo motor rotates the Main Feeder and a Cam at the end of each cycle.
  • the Main Feeder rotates about 5.8 rad while pushing the wire forward and forming a clip.
  • the Main Feeder incorporates a Resting Gap to halt the wire feed before cutting. The continued rotation of the Main Feeder then engages the Cutter via a Camshaft and Cam Lever, severing the wire.
  • the ACD is equipped with a unique collector mechanism that uses claw-shaped arms.
  • Figs. 5A-5E illustrate the collector mechanism and the multiple steps of closing the claw-shaped arms and repositioning the ACD’s head.
  • a second servo motor, Servo Motor 2, causes the claw-shaped arms to close.
  • the Arm Holder continues to move forward to fully close the claw-shaped arms, which securely hold the stem and stake in the center of the arms with minimal damage due to the slots and shape of the claw-shaped arms. By closing these arms, the stem and stake are held together in the center of the arms. Additionally, the collector moves the ACD’s head closer to the stem and stake without further closing the claw-shaped arms before creating the clip, as shown in Fig. 5.
  • MOTOR CONTROL: The power for pushing, curving, and cutting the wire is produced by the Main Servo Motor in both the HCD and ACD.
  • the load on the servo motor may vary dynamically at each step.
  • the rotational speed of the servo motor also influences both the shape and the quality of the clip. Therefore, maintaining a consistent force at a specific angular velocity in the presence of dynamic external torques benefits accurate bending of the wire.
  • torque control is significant to achieve optimal performance and prevent stalling or overloading. Torque control allows the motor to adjust its output torque to compensate for changes in the load, ensuring repeatable motion control with consistent performance.
  • the control objective of an impedance controller in our system is to impose a desired dynamic relationship between the servo motor’s position and the force of interaction with the wire.
  • Impedance is defined as the ratio of the force to the position.
  • impedance control enables the motor to behave as a mass-spring-damper system, commanding the desired position in response to the interaction force of the servo motor and external factors.
  • the impedance controller In response to external force f_e, the impedance controller generates a modified position 80 as follows, where s represents the Laplace Transform variable and all other parameters are as defined previously.
  • the impedance controller remains stable as long as m_d, b_d and k_d are positive values.
• the intrinsic inertia of the Main Feeder, due to its mass, effectively fixes m_d, simplifying the choice of desired impedance to the selection of b_d and k_d.
  • Fig. 29 depicts the block diagram of the proposed impedance controller.
  • the outer loop naturally closes when the servo motor encounters external torques.
• using the estimated torque feedback, the impedance function generates δθ and commands the desired angle θ_d.
  • the inner loop consists of a PID controller to track the desired trajectory for achieving suitable movement of the Main Feeder and, consequently, other parts such as the Bender and Cutter.
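As a concrete illustration of this structure, the following is a minimal discrete-time sketch of the outer impedance loop, assuming Euler integration and illustrative gains; the device’s actual tuning and inner PID parameters are not given in the source.

```python
DT = 0.001                       # control period in seconds (assumed)
M_D, B_D, K_D = 0.02, 0.5, 8.0   # hypothetical desired impedance parameters

class ImpedanceOuterLoop:
    """Generates the position modification delta_theta from the external force
    f_e so that m_d*delta'' + b_d*delta' + k_d*delta = f_e."""

    def __init__(self) -> None:
        self.delta = 0.0      # delta_theta
        self.delta_dot = 0.0

    def update(self, f_e: float, theta_nominal: float) -> float:
        """Return the modified desired angle theta_d passed to the inner PID loop."""
        delta_ddot = (f_e - B_D * self.delta_dot - K_D * self.delta) / M_D
        self.delta_dot += delta_ddot * DT
        self.delta += self.delta_dot * DT
        return theta_nominal + self.delta
```

With positive M_D, B_D, and K_D the loop mimics a stable mass-spring-damper, consistent with the stability condition stated above.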
  • Both ACD and HCD are capable of producing clips in various shapes with the required diameters, including simple and spiral forms.
  • Fig. 7 illustrates the correlation between clip length and the number of Transverse Grooves. Adjusting the bender allows for changes in the diameter and shape of the clip.
  • Fig. 30 depicts the trajectory and torque profile of the Main Servo Motor across both the HCD and ACD platforms during a complete cycle of clip production.
  • the ACD and HCD produce a clip in approximately 0.95 seconds.
  • the impedance controller regulates the Servo Motor to exert accurate force on the wire, maintaining a specific trajectory with the desired angular velocity during clip production.
  • the applied pushing force on the wire initiates from 0.08 seconds into the process and lasts until 0.72 seconds.
  • Wire cutting commences at 0.8 seconds and concludes at 0.92 seconds. This visualization highlights how the introduced impedance controller influences motor performance, affecting the precision and consistency of clip formation.
• RSCS: Robotic Stem-Stake Coupling System
• the robotic arm rotates around the seedling to locate an appropriate clipping point, recognizes this point, and positions the ACD correctly, a process that takes time.
  • the average duration of these steps is detailed in Table 4 for four primary seedlings: Beit Alpha cucumber, chili pepper, bell pepper, and tomato.
  • Table 4 also presents the acceptance rates of clip quality for each type of seedling. According to Table 4, two robotic arms equipped with ACD can achieve the speed of one grower.
  • the selected point should be the highest point, above the uppermost node on the main stem.
  • Pepper seedlings typically have a straight stem with sufficient spacing between nodes, resulting in a higher acceptance rate for bell peppers compared to other seedlings.
• Tomato seedlings, on the other hand, have a relatively short main stem between nodes. Additionally, cucumber leaves tend to be denser around the highest node, which can occasionally lead to issues during clipping as some parts of the leaves may get caught between the two claw-shaped arms, slightly reducing the clip quality.
• growers utilizing the HCD in propagation facilities and greenhouses can achieve approximately 86% greater efficiency compared to those employing plastic clips for the stem-stake coupling task.
  • HCD not only facilitates more efficient operations but also allows growers to allocate their time to other critical tasks.
  • HCD offers cost-saving benefits, further enhancing its value in greenhouse and propagation facility management.
• the HCD simplified the clipping task, allowing farmers to work longer with a lower risk of back injuries caused by chronically poor posture.
  • automatic clipping in large propagation facilities using ACD accelerates the process for millions of seedlings and plants with fewer growers.
• the developed robotic stem-stake coupling system, featuring three gantries and 24 robotic arms equipped with ACDs, has the capacity to couple the stems of 12,000 bell pepper seedlings to stakes within an hour.
  • HCD and ACD utilize sustainable, eco-friendly materials, offering a viable alternative to traditional plastic clips.
• By using recyclable or biodegradable materials, these systems reduce environmental impact and support broader efforts to minimize waste and enhance sustainability in agricultural and horticultural practices.
  • seedlings are cultivated using different methods.
  • One style of cultivation is on the concrete floor, also known as the folding floor.
  • Another style is cultivation on the tray system.
• both styles are used, where seedlings are transferred from the concrete floor to the tray system semi-automatically using specialized equipment.
• To prepare the seedlings for clipping before transportation, other greenhouse machinery is used to rearrange the seedlings by automatically altering the spacing between them for easier access.
  • the tray system carrying the seedlings is passed in front of human workers, who affix a plastic clip around the seedlings and the wooden stake.
• a solution benefits from meshing with existing technology used in propagation facilities, allowing smooth integration with other automated machinery and devices while reducing disruption or cost increases. Therefore, the most recommended robotic solutions are those that can be installed directly where growers perform clipping.
  • Multiple methods may be employed to access seedlings for clipping tasks using a robotic system.
  • Options include employing a mobile robot, utilizing a gantry to maneuver a robotic arm around seedlings, or employing a fixed robotic arm with a carrier transporting seedlings within its workspace.
  • Each strategy offers advantages in terms of efficiency, adaptability, and precision in seedling handling.
  • One strategy is a concept involving compact robotic arms with a restricted operational range. A mobile gantry system moves the arm towards the seedlings, while the gantry glides along the rails of the tray system. This setup facilitates the development of small, simple, and lightweight robotic arms.
  • An alternative approach is to mount the robotic arm to a fixed point on the ground in front of the tray stream.
• as the tray system passes in front of the robotic arm, the arm can access and clip the seedlings much as human workers do.
  • This approach simplifies operations and setup time for vision system operation. However, it relies on the moving tray system and cannot be easily adapted for clipping seedlings on the concrete floor.
  • Another approach is a mobile robot carrying the gantry and a robotic arm.
  • This configuration eliminates reliance on moving trays and can be used for tray systems of various sizes as well as the concrete floor to provide access to seedlings.
  • the robotic arm can operate autonomously among seedlings to perform the clipping task.
  • the disadvantage of this solution is the difficulty in managing multiple such robots. Additionally, the robotic arm’s stability may be compromised, potentially causing issues due to vibrations.
  • Another concept involves a four-wheel mobile system carrying a robotic arm near the seedlings. This method offers advantages in flexibility and adaptation to both cultivation styles.
  • each robotic arm can be strategically positioned around the seedlings as shown in Fig. 4.
  • Each robotic arm is equipped with a camera to determine the optimal stem-stake coupling point and assess the seedling from various angles. If a suitable point is identified, the clipping action is performed. If not, the other robotic arm attempts the task.
  • the gantry-style solution is more appropriate for accommodating a multi-arm solution.
  • the limited space around seedlings on the tray system may impede access to seedlings.
• the spacing between seedlings on trays can be adjusted to enhance accessibility for the robotic arms. This re-spacing is a common practice in manual stem-stake coupling as well.
• Robotic System Framework: After evaluating various possibilities, it appears that adding a gantry system with multiple robotic arms onto the existing tray system is the optimal choice for propagation facilities. This approach eliminates the need to modify rails or trays or to rearrange seedlings, reducing additional workload. While the approach is shown for the tray system, it can be adapted to the concrete-floor style. The number of gantries and robotic arms can be modified depending on the requirements of the facility and the volume of seedlings, owing to the object-oriented design of the robotic stem-stake coupling system.
• Fig. 31B illustrates a schematic representation of the robotic stem-stake coupling system, featuring two gantries and nine robotic arms equipped with automatic clipping devices.
• the robotic stem-stake coupling system has six major components: a Master Control Panel (MCP), Gantry, Robotic Arm, Automatic Clipping Device (ACD), Robotic Control Unit (RCU), and machine vision system.
  • Fig. 32 shows the interconnection of these components.
• MCP: Master Control Panel
  • the Master Control Panel is a centralized interface enabling monitoring and control of key components within the robotic system. It acts as a central hub for coordinating operations, offering users access to essential controls, data, and functionalities. Access to the MCP is available through four distinct interfaces.
• the Graphical User Interface (GUI) is a visual interface that allows users to interact with the entire system through graphical icons and visual indicators.
• Fig. 33 shows the different modules of the GUI; brief information on the numerically labelled GUI modules is as follows:
  • Each joint can move independently at varying speeds or synchronously at set speeds.
• Motor Joystick: The user can adjust the ACD’s position and orientation using buttons.
• Sending Code: Users can send commands to control motors and monitor sensors.
• Clipping Device: Indicates ACD status; users manage the ACD at various control levels.
• Motor/Sensors: Monitoring of motors and sensors.
• Machine Vision: Users can view stereo images and ML results, choose image processing methods, and direct the robotic arm to automate tasks with specified priorities, speeds, directions, and autonomy levels.
• Main Buttons: Users select a port number, connect to the robotic arm, and access functions like emergency stop, reset, or exit the GUI.
• Program/Code: Users can input or modify code for compilation, execute it line by line (forward or backward), pause/resume execution, or halt at each step.
• Control Tray: Interface with third-party devices.
• Compiler Panel: Displays the code and allows users to track its compilation progress.
• TCP/IP Connection: Enables remote robotic control via an internet connection, accessible from devices such as Android phones.
• the smartPAD serves as the interface for overseeing and managing the robotic system, equipped with touch-screen functionality and wired or remote connectivity options. This improves the functionality and mobility of the robotic system and assists the user in maneuvering around the trays.
• the TCP/IP connection offers the possibility of connecting the robotic system to the internet and controlling it remotely.
  • This setup permits remote communication between the robotic system and the GUI, allowing farmers to send commands and receive feedback from anywhere with internet access.
  • farmers can observe the real-time status and performance of the robotic system via remote interfaces, which involves monitoring sensor data, tracking the robot’s location, and overseeing task progress.
  • users can remotely control the robot’s movements and actions and can issue high-level commands to the robotic system, enabling it to autonomously execute predefined tasks.
  • Remote access to robotic systems via TCP/IP facilitates maintenance tasks and diagnostics.
• the developed Compiler allows farmers to write high-level commands abstractly and intuitively, without needing to modify the low-level details of the robotic system’s hardware or communication protocols. Utilizing the compiler simplifies programming the robotic system and enhances accessibility for users of different technical skill levels. Users can remotely modify or enhance the functionality of the robotic system, while the system itself, equipped with high-level intelligence, can execute advanced commands in diverse conditions.
  • Gantry refers to the large and rigid framework that supports and guides the movement of the robotic arms, linear guides, rails, lights, cable carrier chains, and other tools.
  • the Gantry provides a stable and secure structure for mounting robotic components. It ensures that the components are properly aligned and supported during operation, minimizing vibrations and inaccuracies in movement. It is structurally robust to withstand the stresses associated with the dynamic movements and the weight of the robotic components as well as any payloads they are carrying.
  • the Gantry framework ensures synchronized motion of the various components and includes feedback sensors to adjust the position of the robotic arms with precision. It is designed with flexibility, allowing for customization and adaptation to different environments in propagation facilities.
  • gantries can be installed on automated tray conveyance lines transporting seedling trays.
  • the Gantry design avoids modifying conveyor lines during installation.
  • Each Gantry can accommodate up to four robotic arms on each side.
• the Gantry’s specialized design and object-oriented programming allow for adding up to eight robotic arms without requiring hardware or software modifications.
  • Robotic Arm The robotic arm’s responsibility is to place the ACD at the suitable stem-stake coupling point, ensuring it is correctly oriented while maneuvering along a specific route to avoid collisions with surrounding seedlings, all to complete the task efficiently.
  • the design of the robot arm incorporates considerations for speed, affordability, reliability, and ease of maintenance.
• the robotic arm should afford simplicity, eliminating the need for robust and costly hardware for inverse kinematics calculations, path planning, and the control system. It should be lightweight and equipped with the features of an intelligent multi-agent system to facilitate communication with adjacent robotic arms. Choosing an optimal configuration for the robotic arm from a range of possibilities is advantageous to achieving a desired performance for the robotic system. In evaluating various robotic arm configurations for the stem-stake coupling task, we took into account eleven key parameters, including the following:
  • Payload Capacity The maximum weight the robotic arm can handle without compromising performance or safety.
  • Dexterous Workspace The area within the robotic arm’s workspace where it can reach points from various orientations, offers increased flexibility and versatility in manipulation tasks.
  • the robotic arm can easily reach and operate within different areas of its workspace, considering obstacles, joint configurations, and potential collisions.
  • Precision The repeatability of the robotic arm’s positioning, ensuring consistent performance in manipulating objects.
  • Speed The rate at which the robotic arm can move impacts efficiency and cycle time for completing tasks.
  • Vibration The level of oscillatory motion or shaking exhibited by the robotic arm during operation, which can affect accuracy, precision, and performance.
• Control System The hardware and software components responsible for programming, monitoring, and executing tasks with the robotic arm, ensuring efficient operation and coordination of movements.
  • Computation Cost The computational resources required to control and operate the robotic arm, including processing power, memory, and energy consumption.
• Five different configurations of robotic arms were selected and evaluated to determine the most suitable one for the robotic system, as shown in Fig. 34.
  • the PPPRRR configuration emerges as the most suitable option for the robotic system.
  • the initial three prismatic joints primarily handle the positioning of the ACD, while the subsequent three revolute joints govern its orientation.
• the robotic arm features three prismatic joints that adjust the position of the ACD relative to the main reference frame at the suitable clipping point. Additionally, there are two revolute joints responsible for orienting the ACD.
  • the arm can handle payloads of up to 20 Newtons.
• the Robotic Control Unit (RCU) controls the robotic arm and manages the movement of each joint to achieve desired configurations or trajectories, allowing the arm to perform its designated tasks effectively.
• the RCU utilizes PID controllers to regulate the movement of individual joints, and it incorporates linear segments with parabolic blends (LSPB) for trajectory planning.
  • the RCU ensures that the robotic arm maintains an accuracy of 0.1 mm and achieves a maximum speed of 400 mm/sec. Additionally, it guarantees the orientation accuracy of the ACD to be 0.2 degrees and enables a maximum angular velocity of 120 degrees/sec. Users can utilize a keypad to send commands to control the robot independently.
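The LSPB trajectory planning mentioned above can be sketched as follows; the boundary values and velocity parameter in the usage line are illustrative, not the RCU’s actual settings.

```python
def lspb(q0: float, qf: float, tf: float, v: float, t: float) -> float:
    """Position at time t of a linear-segment-with-parabolic-blends profile
    from q0 to qf over tf seconds; requires (qf-q0)/tf < v <= 2*(qf-q0)/tf."""
    tb = (q0 - qf + v * tf) / v             # blend (acceleration) duration
    a = v / tb                              # blend acceleration
    if t < tb:                              # initial parabolic blend
        return q0 + 0.5 * a * t * t
    if t <= tf - tb:                        # constant-velocity linear segment
        return q0 + 0.5 * a * tb * tb + v * (t - tb)
    return qf - 0.5 * a * (tf - t) ** 2     # final parabolic blend

# e.g., a 100 mm move in 0.5 s with a 250 mm/s cruise velocity:
positions = [lspb(0.0, 100.0, 0.5, 250.0, k * 0.05) for k in range(11)]
```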
  • Fig. 36 presents the robotic arm’s logic flowchart, outlining the sequence of steps along with inputs, outputs, and loops.
• seedlings are arranged in a predetermined order on the tray, and the position of each seedling on the tray is known to within approximately 5 cm.
  • the MCP evaluates the distance between the seedlings and the robotic arms to determine which seedlings should be clipped by each specific robotic arm. Based on the approximate position of the seedling on the tray and the robotic arm’s position in relation to the tray, the robot places the ACD near the desired seedling. Afterward, it follows the steps outlined in the flowchart to identify the appropriate point for stem-stake coupling. If obstructed by dense foliage, the robotic arm adjusts the ACD’s height or repositions it around the seedling to pinpoint a suitable location and finalize the coupling process. Alternatively, it may assign the coupling task to other robotic arms.
  • the ACD is equipped with a Megapixel/USB5000W02M stereo camera which captures images of the seedlings. These images are then transmitted to the machine vision System for further processing.
  • the machine vision System enables each robotic arm to analyze images and find the most suitable stem-stake coupling point using various algorithms and techniques.
  • the initial stage of the vision algorithm involves seedling recognition.
  • the machine vision System employs an adaptive feature-based plant recognition technique to accurately segment the seedling from other elements present in the image.
  • the algorithm utilizes four distinct techniques to determine the clipping point, considering factors like seedling type, environmental conditions such as lighting, seedling age, and the presence of other seedlings in the image.
  • the first technique is straightforward, as it finds the top of the seedling and identifies 3 to 5 centimeters below that point as the clipping point, depending on the type of the seedling.
  • the second technique uses real-time point recognition using kernel density estimators and pyramid histogram of oriented gradients (KDE-PHOG) [M. Asadi Shirzi, M. R. Kermani, Real-time point recognition for seedlings using kernel density estimators and pyramid histogram of oriented gradients, in: Actuators, Vol. 13, MDPI, 2024, p. 81.].
• Real-time point localization using feature-based soft margin SVM-PCA method (RTPL) [M. Asadi Shirzi, M. R. Kermani, Real-time point localization on plants using feature-based soft margin SVM-PCA method, IEEE Transactions on Instrumentation and Measurement, IEEE, under processing, 2024] represents another rapid and precise method for identifying the most suitable stem-stake coupling point.
  • YOLO-v8 [T. Han, T. Cao, Y. Zheng, L. Chen, Y. Wang, B. Fu, Improving the detection and positioning of camouflaged objects in yolov8, Electronics 12 (20) (2023) 4213], a deep learning algorithm, presents another option for identifying the stem-stake coupling point.
  • users have the option to manually define the point on the image using Manual Cursor Selection. This allows users to move the cursor across the image using the mouse and click on the desired point, which is then designated as the appropriate coupling point.
  • a novel automatic stem-stake coupling device has been specifically designed and integrated into the robotic system.
  • the claw-shaped arms are closed to bring the stem and the stake together.
  • a thin wire is then fed and guided through specialized components that shape it into a desired circular clip. Then the wire wraps around the stem and wooden stake. Finally, the Cutter trims the wire upon completion of the clip formation.
  • the size and shape of the clips can be adjusted based on the seedling or plant.
  • the stereo camera is installed on the ACD to provide real-time images for the machine vision System.
• RESULTS: To assess the performance of the robotic system, we developed a robotic stem-stake coupling system comprising one gantry and two robotic arms, which we installed on a specialized tray commonly used in propagation facilities for transporting seedlings.
  • the robotic system performed the stem-stake coupling task on three types of seedlings commonly grown in greenhouses: cucumber, pepper, and tomato.
• Recall = TP / (TP + FN) and Precision = TP / (TP + FP) (18), where true positive (TP) represents the number of correctly detected stem-stake coupling points, false negative (FN) is the number of missed detections, and false positive (FP) is the number of false alarms. Recall measures the system’s ability to identify relevant instances, while Precision reflects the success rate of correct detection, illustrating the system’s accuracy in identifying the desired outcomes. We enlisted the expertise of farmers to determine which clips corresponded to TP or FN. These results are presented in Table 5. As observed, the RTPL method exhibits superior performance compared to the other three methods.
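Restating Eq. (18) as code (the counts in the usage lines are illustrative, not the Table 5 data):

```python
def precision(tp: int, fp: int) -> float:
    """Fraction of detected coupling points that are correct."""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Fraction of true coupling points that were detected."""
    return tp / (tp + fn)

print(precision(tp=89, fp=6))  # 0.9368...
print(recall(tp=89, fn=5))     # 0.9468...
```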
  • Table 7 displays the numbers of seedlings per tray and the stem-stake coupling speed allocated by each grower across three distinct plants in a propagation facility arena (this data originates from the propagation facility at Roelands Plant Farms). Furthermore, Table 8 includes the stem-stake coupling speed for two distinct configurations of the robotic system.
  • the initial setup (Pl) comprises a single robotic arm mounted on a Gantry, while the second configuration (P2) features twenty-four robotic arms distributed across three gantries.
• the stem-stake coupling speed of a single robotic arm within the robotic system is lower compared to the setup where multiple robotic arms are situated closely together on gantries, because in the latter arrangement the robotic arms do not need to move much to reach the seedlings.
  • the stem-stake coupling speed of a grower is approximately twice as fast as that of a robotic arm.
• the robotic arm used machine vision calculations to reach these predetermined coordinates. After identifying and reaching the target point, we measured the error in the X, Y, and Z directions (e_x, e_y, e_z), which are due to errors in both positioning and orientation of the ACD (Fig. 39a). Then we calculated the robotic arm’s cumulative error e_a = sqrt(e_x^2 + e_y^2 + e_z^2).
  • Fig. 39b displays the box-and-whisker plot illustrating the accuracy of the robotic arm and machine vision in determining and reaching the position and orientation of a specific point within the robotic arm’s working space.
  • Fig. 39c illustrates the accuracy and repeatability of the robotic arm in reaching a specific point in the working space (black dots), as well as the performance of the integrated system where machine vision calculates the spatial coordinates of the point from images, and the robotic arm moves the ACD to the identified point (blue dots). Determining the spatial coordinates of a point through stereo matching and disparity reduces the accuracy of the integrated robotic arm and machine vision system.
  • the cumulative error across all 5 joints results in the robotic arm’s accuracy being within less than 1 mm. If the integrated system of machine vision and the robotic arm achieve positional accuracy within 10 mm, the claw-shaped arms of the ACD can correct for any discrepancies by precisely aligning the seedling’s stem and the stake at the center. This integrated system demonstrates a positional accuracy within 10 mm in 97.5% of instances.
  • the proposed system featuring three gantries and 24 robotic arms, has the potential to replace the work of 12 expert farmers in a propagation facility. With an average success rate of 89%, the system can identify unsuccessful attempts and flag them, allowing two additional farmers to complete the remaining coupling tasks. This effectively reduces the reliance on expert farmers from 12 to just 2.
• the RTPL technique employed within the machine vision system exhibits superior performance in accurately identifying the stem-stake coupling point compared to other techniques. Each coupling task costs less than 1.5 cents when executed by the robotic system, in contrast to the 3 cents incurred when completed manually, which includes the labor cost and the cost of the clip itself.
• the cost reduction achieved through the use of the ACD is attributed to its utilization of thin wire to produce clips, as opposed to pre-made plastic clips, as well as the reduction of labor costs.
  • This approach requires inexpensive materials and involves a clip-making process that consumes minimal energy.
  • the use of metallic wire clips enhances environmental sustainability by reducing plastic waste.
  • the robotic system Due to the object-oriented design of the robotic system, it is feasible to modify the configuration to enhance the stem-stake coupling speed.
  • the robotic system is compatible with automated greenhouses and propagation facilities, requiring minimal alteration to their existing automatic lines.
• the robotic system ensures maximum efficiency and productivity by allowing the task to be carried out around the clock with consistent performance and reliability.
  • Each of the 24 robotic arms consumes 40 watts of power, resulting in a total system power usage of less than 1 kilowatt. This demonstrates the energy efficiency of the robotic system, as it minimizes power consumption while maintaining optimal performance.
  • the integrated machine vision and robotic arm system demonstrate a positional accuracy within 10 mm for 97.5% of attempts, with the cumulative error across all five joints resulting in the robotic arm achieving an accuracy of less than 1 mm. This level of precision makes the robotic system suitable for various precision agricultural applications.
  • the currently disclosed device, system and method provide plant processing in autonomous, semi-autonomous and manual modes.
• Image sensor and machine vision components combined with a robot and robot controller provide autonomous or semi-autonomous processing of plants, and specifically identifying and effecting target points in plants.
• as a hand-held device, the clipping device still provides a manual benefit as a hand-held clipping device/gun.
• the clipping device, used by either a farmer for semi-autonomous clipping or an automatic machine or robot for fully autonomous clipping, can have an immense impact on the efficiency of the process, the reduction of labour costs associated with clipping tasks, and the prevention of work-related injuries, such as back injuries, strains, and sprains, associated with awkward body positions during clipping.
• the clipping device can accommodate a variety of materials for making a clip, such as copper, stainless steel, polyethylene, polyamide, polyesters, or a variety of plastic wires including thermoplastic or thermoset materials. Many plastic blends, composites, or metal alloys may be used.
• a clipping point is an example of a target point, and the machine vision and robot components may be adapted to other target points, including for example spraying points, pruning points, harvesting points, or any other target point relevant to horticultural or agricultural plant care or handling.
  • a desired target point may vary according to a specific plant type and specific implementation.
  • variation of a clipping point is contemplated.
  • the clipping point may be proximal to the highest point of a plant, higher than the uppermost node on the main stem. If the length of the main stem is short between two nodes, the clipping point can be selected below the highest node or axial.
  • leaves may be dense around the highest node, and some parts of the main stem may be covered by leaves. Thus recognizing the main stem and petiole is difficult. The different shapes and types of seedlings make the recognition process even harder.
  • the selection of the clipping point is a cognitive process that relies on heuristic information. Therefore, a desired clipping point may vary to suit a particular implementation, and the machine vision component may be configured and trained accordingly.
• Various techniques may be used in machine vision to find a suitable stem-stake coupling point, such as: Real-Time Point Recognition for Seedlings Using Kernel Density Estimators and Pyramid Histogram of Oriented Gradients, Real-time Point Localization on Plants using Feature-based Soft Margin SVM-PCA Method, and YOLO v8/v10.
  • the machine vision component may analyze acquired image data using a feature descriptor.
  • the feature descriptor may be variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof.
• Feature descriptors can be selected manually or automatically, for example using correlation-based feature selection.
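A sketch of how the listed first-order statistical descriptors might be computed over a grayscale patch is given below; NumPy is assumed, and the exact descriptor definitions used by the machine vision component may differ.

```python
import numpy as np

def patch_features(patch: np.ndarray) -> dict:
    """First-order statistical descriptors of a grayscale patch (values 0-255)."""
    x = patch.astype(np.float64).ravel()
    mean, var = x.mean(), x.var()
    std = x.std() if x.std() > 0 else 1.0          # guard against flat patches
    z = (x - mean) / std
    hist, _ = np.histogram(x, bins=256, range=(0, 255))
    p = hist / hist.sum()
    p_nonzero = p[p > 0]                           # drop empty bins before the log
    return {
        "mean": float(mean),
        "variance": float(var),
        "skewness": float(np.mean(z ** 3)),
        "kurtosis": float(np.mean(z ** 4)),
        "entropy": float(-np.sum(p_nonzero * np.log2(p_nonzero))),
        "energy": float(np.sum(p ** 2)),
    }
```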
  • Correlation is a statistical measure that expresses the strength of the relationship between two variables. A positive correlation occurs for variables that move in the same direction, and a negative correlation occurs when two variables move in opposite directions.
• Correlation is often used to determine whether there is a cause-and-effect relationship between two variables, and it is often used in machine learning to identify multicollinearity, that is, when two or more predictor variables are highly correlated with each other.
  • Multicollinearity can impact the accuracy of predictive models and it is an important indicator of suitability of the variables selected for training.
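A simple way to flag such multicollinearity is to scan the feature correlation matrix; a minimal sketch (NumPy assumed, threshold illustrative):

```python
import numpy as np

def multicollinear_pairs(X: np.ndarray, names: list[str], threshold: float = 0.9):
    """X is (n_samples, n_features); returns feature pairs with |r| >= threshold."""
    r = np.corrcoef(X, rowvar=False)
    flagged = []
    for i in range(r.shape[0]):
        for j in range(i + 1, r.shape[1]):
            if abs(r[i, j]) >= threshold:
                flagged.append((names[i], names[j], float(r[i, j])))
    return flagged
```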
• extracting the correlation coefficient between two sets of stochastic variables is nontrivial, in particular when canonical correlation analysis indicates degraded correlations due to heavy noise contributions. For this reason, we avoided using the Fourier shape descriptor and SAR because of their heavy noise contributions.
  • the currently disclosed device, system and method can accommodate plant processing in any type of environment including, for example, horticulture, agriculture, outdoor farming, indoor greenhouse, and the like.
  • Embodiments disclosed herein, or portions thereof, can be implemented by programming one or more computer systems or devices with computer-executable instructions embodied in a non- transitory computer-readable medium. When executed by a processor, these instructions operate to cause these computer systems and devices to perform one or more functions particular to embodiments disclosed herein. Programming techniques, computer languages, devices, and computer-readable media necessary to accomplish this are known in the art.
  • a non-transitory computer readable medium embodying a computer program for processing plants may comprise: computer program code for acquiring image data of a plant with an image sensor; computer program code for analyzing the acquired image data with a machine vision component to recognize and segment at least a first anatomical structure of the plant to output segmented image data; computer program code for identifying a target point in the plant based on the segmented image data; computer program code for determining distance/depth from the image sensor to the target point in the plant; and computer program code for generating a control signal based on the distance/depth and sending the control signal from a controller to position a robot at the target point in the plant.
  • the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data.
  • the computer readable medium further comprises computer program code for applying a kernel density estimator to the segmented image data to determine the target point.
  • the computer readable medium further comprises computer program code for validating the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
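Taken together, the recited program components amount to the following pipeline; every name below is a hypothetical placeholder for the recited steps, not an API from the source.

```python
def process_plant(image_sensor, vision_model, controller, robot):
    image = image_sensor.acquire()                       # acquire image data of a plant
    segmented = vision_model.segment(image)              # recognize/segment anatomical structure
    target = vision_model.find_target_point(segmented)   # e.g., KDE peak over segmented pixels
    depth = image_sensor.depth_to(target)                # stereo/LIDAR/TOF distance to target
    signal = controller.make_control_signal(target, depth)
    robot.move_to(signal)                                # position the robot at the target point
```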
  • the computer readable medium is a data storage device that can store data, which can thereafter, be read by a computer system.
• Examples of a computer readable medium include read-only memory, random-access memory, CD-ROMs, magnetic tape, optical data storage devices and the like.
• the computer readable medium may be geographically localized or may be distributed over a network-coupled computer system so that the computer readable code is stored and executed in a distributed fashion.
  • Computer-implementation of the system or method typically comprises a memory, an interface and a processor.
  • the interface may include a software interface that communicates with an end-user computing device through an Internet connection.
  • the interface may also include a physical electronic device configured to receive requests or queries from a device sending digital and/or analog information.
  • the interface can include a physical electronic device configured to receive signals and/or data relating to the plant processing method and system, for example from an imaging sensor or camera or image processing device.
  • Any suitable processor type may be used depending on a specific implementation, including for example, a microprocessor, a programmable logic controller or a field programmable logic array.
  • any conventional computer architecture may be used for computer-implementation of the system or method including for example a memory, a mass storage device, a processor (CPU), a graphical processing unit (GPU), a Read-Only Memory (ROM), and a Random-Access Memory (RAM) generally connected to a system bus of data-processing apparatus.
  • Memory can be implemented as a ROM, RAM, a combination thereof, or simply a general memory unit.
  • Software modules in the form of routines and/or subroutines for carrying out features of the system or method can be stored within memory and then retrieved and processed via processor to perform a particular task or function. Similarly, one or more method steps may be encoded as a program component, stored as executable instructions within memory and then retrieved and processed via a processor.
• a user input device, such as a keyboard, mouse, or another pointing device, can be connected to a PCI (Peripheral Component Interconnect) bus.
  • the software may provide an environment that represents programs, files, options, and so forth by means of graphically displayed icons, menus, and dialog boxes on a computer monitor screen. For example, any number of plant images or clipping device characteristics or robot arm characteristics may be displayed.
  • Computer-implementation of the system or method may accommodate any type of end-user computing device including computing devices communicating over a networked connection.
  • the computing device may display graphical interface elements for performing the various functions of the system or method, including for example display of a clipping device characteristic or a robot arm characteristic during a stem-stake coupling task.
  • the computing device may be a server, desktop, laptop, notebook, tablet, personal digital assistant (PDA), PDA phone or smartphone, and the like.
  • the computing device may be implemented using any appropriate combination of hardware and/or software configured for wired and/or wireless communication. Communication can occur over a network, for example, where remote control of the system is desired.
  • the system or method may accommodate any type of network.
  • the network may be a single network or a combination of multiple networks.
  • the network may include the internet and/or one or more intranets, landline networks, wireless networks, and/or other appropriate types of communication networks.
  • the network may comprise a wireless telecommunications network (e.g., cellular phone network) adapted to communicate with other communication networks, such as the Internet.
  • the network may comprise a computer network that makes use of a TCP/IP protocol (including protocols based on TCP/IP protocol, such as HTTP, HTTPS or FTP).
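For illustration only, a remote command over TCP/IP could look like the sketch below; the host, port, and command format are assumptions, not the system’s documented protocol.

```python
import socket

def send_command(command: str, host: str = "192.168.0.10", port: int = 5050) -> str:
    """Send one high-level command and return the robot's acknowledgement."""
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(command.encode("utf-8") + b"\n")
        return sock.recv(1024).decode("utf-8")

# e.g., send_command("CLIP TRAY=3 ROW=2")  # hypothetical command string
```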
• Example 1 A stem-to-stake clipping device, comprising: a passively rotating spool supporting and supplying a roll of wire; a feeder wheel pushing the wire through an entrance of a wire guide and an exit of a wire guide; a cam co-rotationally coupled to the feeder wheel; a rotational actuator driving rotation of the feeder wheel and the cam; a bender positioned proximal to the exit of the wire guide, the bender providing a strike surface for curving the wire into a circular clip; a cutter positioned proximal to the exit of the wire guide, the cutter providing an edge for cutting the wire; a lever having a first end abutting the cam, a second end positioning the cutter and the bender, and an intermediate pivot point; the lever following the cam to move from a first pivot position aligning the bender strike surface with wire pushed through the exit end to a second pivot position sweeping the cutter edge across the exit end to cut the wire.
  • Example 2 The device of example 1, wherein a circumference of the cam is eccentric to a rotation point of the cam, the lever moving from a first pivot position to a second pivot position by following the circumference of the cam.
  • Example 3 The device of example 2, wherein the cam is a drop cam with a single projecting tab/tooth, and the lever maintains the first pivot position for a major portion of each cam rotation, and the lever moves to the second pivot position when following the single projecting tab/tooth.
  • Example 4 The device of example 3, wherein the lever moves to the second pivot position when following in sequence a base to an apex of the single projecting tab/tooth, and returns to the first pivot position when following in sequence the apex to the base of the single projecting tab/tooth.
  • Example 5 The device of any one of examples 1-4, wherein the intermediate pivot point of the lever is rotationally coupled to the wire guide.
  • Example 6 The device of any one of examples 1-5, wherein the cutter is mounted on the second end of the lever, and the bender is mounted on the cutter.
  • Example 7 The device of any one of examples 1-6, wherein the feeder wheel pulls the wire from the spool and pushes the wire through the wire guide.
  • Example 8 The device of any one of examples 1-7, wherein a central groove is circumferentially formed in the feeder wheel for receiving the wire, and a plurality of transverse grooves intersect the central groove, intersecting edges of each of the plurality of transverse grooves and the central groove forming a friction surface for engaging the wire.
• Example 9 The device of example 8, wherein an increase in the number of transverse grooves is positively correlated with the length of the wire pulled from the spool in a single rotation cycle of the feeder wheel.
  • Example 10 The device of example 9, wherein the length of the wire pulled from the spool is determined by
  • Example 11 The device of any one of examples 1-9, wherein a gap is formed in a circumference of the feeder wheel, the gap reducing frictional force of the feeder wheel on the wire, the gap rotating to face the wire in overlapping coordination with the second pivot position of the lever.
  • Example 12 The device of example 11, wherein the gap faces the wire simultaneously with the second pivot position.
  • Example 13 The device of example 11, wherein the feeder wheel is a planar disc and the gap is formed as a flattened portion of the circumference of the feeder wheel.
  • Example 14 The device of example 11, wherein the gap provides a smooth surface devoid of a central groove and devoid of a transverse groove.
• Example 15 The device of any one of examples 1-14, wherein the wire guide is formed as a body with a bore extending through the body, the bore defining a lumen having a diameter sized to receive the wire, the lumen communicative with a first open end and an opposing second open end, the first open end being the entrance of the wire guide and the second open end being the exit of the wire guide.
• Example 16 The device of any one of examples 1-15, wherein the bender strike surface is adjustable and tunable, and a change in a position of the strike surface changes the curving of the wire and the size or shape of the circular clip.
  • Example 17 The device of any one of examples 1-15, wherein the first pivot position is variable in a single rotational cycle by the lever following a varied radius of the cam.
  • Example 18 The device of example 17, wherein variation of the first pivot position in the single rotation cycle changes the curving of the wire and produces a spiral shape/configuration of the circular clip.
  • Example 19 The device of any one of examples 1-18, wherein the rotational actuator is a servo motor, a stepper motor, an AC motor, a DC motor, a pneumatic actuator or a hydraulic actuator.
  • Example 20 The device of any one of examples 1-19, further comprising a clamp comprising first and second opposing jaws rotationally mounted to a frame of the device at first and second clamp rotation points, respectively, the clamp aligned with the exit of the wire guide.
  • Example 21 The device of example 20, further comprising a linear actuator pivotably coupled to both of the first and second opposing jaws at a common third clamp rotation point, the linear actuator driving counter-rotation of first and second opposing jaws to circumferentially reduce an open space between the first and second opposing jaws.
  • Example 22 The device of example 21, wherein the linear actuator is a servo motor (or a stepper motor, an AC motor, a DC motor, a pneumatic actuator or a hydraulic actuator) driving a pinion that engages a rack, an end of the rack coupled to the common third clamp rotation point.
  • Example 23 The device of any one of examples 20-22, further comprising a camera mounted to the frame of the device, an orientation and a field of view of the camera configured to capture the clamp.
  • Example 24 The device of example 23, wherein the camera is a stereo camera comprising at least two lenses.
  • Example 25 The device of example 23 or 24, wherein the camera includes a light detection and ranging (LIDAR) sensor.
  • Example 26 The device of example 23 or 24, wherein the camera includes a time-of-flight (TOF) sensor.
  • Example 27 The device of any one of examples 1-26, further comprising an optical sensor positioned proximal to a circumference of the feeder wheel, the optical sensor triggered by an indicator on the feeder wheel, the optical sensor sending a pulse signal after completion of each cycle of clipping.
  • Example 28 The device of any one of examples 1-27, further comprising a heater element coupled to the wire guide.
  • Example 29 The device of any one of examples 1-28, wherein the wire is a metal material that is copper, stainless steel, or any alloy thereof.
  • Example 30 The device of any one of examples 1-28, wherein the wire is a plastic material that is polyethylene, polyamide, polyester, any blend thereof, or any composite thereof.
  • Example 31 A method for processing plants, the method comprising: acquiring image data of a plant with an image sensor; analyzing the acquired image data with a machine vision component to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identifying a target point in the plant based on the segmented image data; determining distance/depth from the image sensor to the target point in the plant; generating a control signal based on the distance/depth and sending the control signal from a controller to position a robot at the target point in the plant.
  • Example 32 The method of example 31, wherein the image sensor is part of a camera.
  • Example 33 The method of example 32, wherein the camera is a stereo camera comprising at least two lenses.
  • Example 34 The method of any one of examples 31-33, wherein the image sensor is a light detection and ranging (LIDAR) sensor.
• Example 35 The method of any one of examples 31-33, wherein the image sensor is a time-of-flight (TOF) sensor.
• Example 36 The method of any one of examples 31-35, wherein the acquired image data is converted to the LAB color space.
  • Example 37 The method of any one of examples 31-36, wherein the machine vision component analyzes the acquired image data using a feature descriptor.
  • Example 38 The method of example 37, wherein the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data.
  • Example 39 The method of example 37 or 38, wherein the machine vision component is trained using labeled images and K-fold cross-validation.
  • Example 40 The method of any of examples 37-39, wherein the machine vision component comprises an artificial neural network.
  • Example 41 The method of any one of examples 37-40, wherein the feature descriptor is variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof.
  • Example 42 The method of any one of examples 37-41, wherein the feature descriptor is selected by an automated algorithm.
  • Example 43 The method of any one of examples 31-42, further comprising applying morphological image processing to the segmented image data.
  • Example 44 The method of example 43, wherein the morphological image processing outputs a boundary image data of the at least first anatomical structure of the plant.
  • Example 45 The method of example 43, wherein the morphological image processing outputs a skeleton image data of the at least first anatomical structure of the plant.
  • Example 46 The method of example 43, wherein the morphological image processing eliminates at least a second anatomical structure of the plant from the segmented image data.
  • Example 47 The method of any one of examples 31-46, further comprising applying a point density variation to the segmented image data to determine the target point.
  • Example 48 The method of any one of examples 31-46, further comprising applying a kernel density estimator to the segmented image data to determine the target point.
  • Example 49 The method of any one of examples 31-48, further comprising validating the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
  • Example 50 The method of example 49, further comprising normalizing the first principal orientation of the histogram gradient of the voxel by matching and correlating with a predetermined second principal orientation of a histogram gradient of a ground voxel encompassing a predetermined suitable target point.
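A minimal sketch of the kernel density estimator step in Example 48, assuming SciPy’s Gaussian KDE and taking the density peak over segmented stem pixels as the candidate target point; the actual estimator and bandwidth may differ.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_target_point(stem_pixels: np.ndarray) -> tuple[float, float]:
    """stem_pixels: (N, 2) array of (x, y) coordinates from the segmented image."""
    kde = gaussian_kde(stem_pixels.T)           # fit a 2-D Gaussian KDE
    density = kde(stem_pixels.T)                # density at each segmented pixel
    x, y = stem_pixels[np.argmax(density)]      # densest pixel as candidate point
    return float(x), float(y)
```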
  • Example 51 The method of any one of examples 31-50, further comprising projecting a coordinate map of the image sensor to a coordinate map of the robot.
  • Example 52 The method of example 51, wherein the controller receives real-time image data identifying the target point location, the controller receives real-time location data identifying a current location of the robot, and the controller generating and communicating the control signal to the robot to minimize a difference between the real-time location data and the target point location.
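The closed loop in Example 52 can be sketched as a simple proportional law that shrinks the error between the robot’s reported location and the target; the gain is illustrative.

```python
import numpy as np

K_P = 0.5  # proportional gain (assumed)

def control_step(target_xyz: np.ndarray, robot_xyz: np.ndarray) -> np.ndarray:
    """Velocity command proportional to the remaining position error."""
    return K_P * (target_xyz - robot_xyz)
```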
  • Example 53 The method of any one of examples 31-52, further comprising mounting an end-effector to the robot.
  • Example 54 The method of example 53, wherein the end-effector is a clipping device for generating a clip to link a stem of the plant to a supporting stake, or the end-effector is a spraying device for localized spraying of the plant, or the end-effector is a cutting device for localized pruning of the plant.
  • Example 55 The method of any one of examples 31-54, further comprising mounting a sensor to the robot.
  • Example 56 The method of example 55, wherein the sensor is the image sensor, or the sensor is a localization sensor, or the sensor is a contact-force sensor, or the sensor is a thermal sensor.
  • Example 57 The method of any one of examples 31-56, wherein the target point is a clipping point for linking a stem of the plant to a supporting stake, or the target point is a spraying point for localized spraying of the plant, or the target point is a cutting point for localized pruning of the plant.
  • Example 58 The method of any one of examples 31-56, wherein the plant is grown inside a greenhouse.
  • Example 59 The method of any one of examples 31-56, wherein the plant is grown in an agricultural farming outdoor facility and is exposed to natural weather elements.
  • Example 60 The method of any one of examples 31-58, wherein the at least first anatomical structure of the plant is a stem, leaf, branch, flower, bud, node, or petiole.
  • Example 61 The method of any one of examples 31-59, wherein the at least first anatomical structure of the plant is a plurality of anatomical structures, and the acquired image data captures multiple nodes of a stem and leaves of the plant.
  • Example 62 A system for processing plants, the system comprising: a memory configured to store image data; an image sensor configured to acquire image data of a plant; a processor configured to: analyze the acquired image data with a machine vision component trained to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identify a target point in the plant based on the segmented image data; determine distance/depth from the image sensor to the target point in the plant; a controller configured to generate a control signal based on the distance/depth and communicate the control signal to a robot to position the robot at the target point in the plant.
  • Example 63 The system of example 62, wherein the image sensor is part of a camera.
  • Example 64 The system of example 63, wherein the camera is a stereo camera comprising at least two lenses.
  • Example 65 The system of any one of examples 62-64, wherein the image sensor is a light detection and ranging (LIDAR) sensor.
• Example 66 The system of any one of examples 62-64, wherein the image sensor is a time-of-flight (TOF) sensor.
• Example 67 The system of any one of examples 62-66, wherein the acquired image data is converted to the LAB color space.
  • Example 68 The system of any one of examples 62-67, wherein the machine vision component analyzes the acquired image data using a feature descriptor.
  • Example 69 The system of example 68, wherein the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data.
  • Example 70 The system of example 68 or 69, wherein the machine vision component is trained using labeled images and K-fold cross-validation.
  • Example 71 The system of any of examples 68-70, wherein the machine vision component comprises an artificial neural network.
  • Example 72 The system of any one of examples 68-71, wherein the feature descriptor is variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof.
  • Example 73 The system of any one of examples 68-72, wherein the feature descriptor is selected by an automated algorithm.
  • Example 74 The system of any one of examples 62-73, wherein the processor is configured to apply morphological image processing to the segmented image data.
  • Example 75 The system of example 74, wherein the morphological image processing outputs a boundary image data of the at least first anatomical structure of the plant.
• Example 76 The system of example 74, wherein the morphological image processing outputs a skeleton image data of the at least first anatomical structure of the plant.
  • Example 77 The system of example 74, wherein the morphological image processing eliminates at least a second anatomical structure of the plant from the segmented image data.
  • Example 78 The system of any one of examples 62-77, wherein the processor is configured to apply a point density variation to the segmented image data to determine the target point.
  • Example 79 The system of any one of examples 62-77, wherein the processor is configured to apply a kernel density estimator to the segmented image data to determine the target point.
  • Example 80 The system of any one of examples 62-79, wherein the processor is configured to validate the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
  • Example 81 The system of example 80, wherein the processor is configured to normalize the first principal orientation of the histogram gradient of the voxel by matching and correlating with a predetermined second principal orientation of a histogram gradient of a ground voxel encompassing a predetermined suitable target point.
  • Example 82 The system of any one of examples 62-81, wherein the processor is configured to project a coordinate map of the image sensor to a coordinate map of the robot.
  • Example 83 The system of example 82, wherein the controller receives real-time image data identifying the target point location, the controller receives real-time location data identifying a current location of the robot, and the controller generates and communicates the control signal to the robot to minimize a difference between the real-time location data and the target point location.
  • Example 84 The system of any one of examples 62-83, further comprising an end-effector mounted to the robot.
  • Example 85 The system of example 84, wherein the end-effector is a clipping device for generating a clip to link a stem of the plant to a supporting stake, or the end-effector is a spraying device for localized spraying of the plant, or the end-effector is a cutting device for localized pruning of the plant.
  • Example 86 The system of any one of examples 62-85, further comprising a sensor mounted to the robot.
  • Example 87 The system of example 86, wherein the sensor is the image sensor, or the sensor is a localization sensor, or the sensor is a contact-force sensor, or the sensor is a thermal sensor.
  • Example 88 The system of any one of examples 62-87, wherein the target point is a clipping point for linking a stem of the plant to a supporting stake, or the target point is a spraying point for localized spraying of the plant, or the target point is a cutting point for localized pruning of the plant.
  • Example 89 The system of any one of examples 62-87, wherein the plant is grown inside a greenhouse.
  • Example 90 The system of any one of examples 62-87, wherein the plant is grown in an agricultural farming outdoor facility and is exposed to natural weather elements.
  • Example 91 The system of any one of examples 62-90, wherein the at least first anatomical structure of the plant is a stem, leaf, branch, flower, bud, node, or petiole.
  • Example 92 The system of any one of examples 62-91, wherein the at least first anatomical structure of the plant is a plurality of anatomical structures, and the acquired image data captures multiple nodes of a stem and leaves of the plant.


Abstract

Described herein is a stem-to-stake clipping device, comprising: a passively rotating spool supporting and supplying a roll of wire; a feeder wheel pushing the wire through an entrance of a wire guide and an exit of a wire guide; a cam co-rotationally coupled to the feeder wheel; a rotational actuator driving rotation of the feeder wheel and the cam; a bender positioned proximal to the exit of the wire guide, the bender providing a strike surface for curving the wire into a circular clip; a cutter positioned proximal to the exit of the wire guide, the cutter providing an edge for cutting the wire; a lever having a first end abutting the cam, a second end positioning the cutter and the bender, and an intermediate pivot point; the lever following the cam to move from a first pivot position aligning the bender strike surface with wire pushed through the exit end to a second pivot position sweeping the cutter edge across the exit end to cut the wire. Methods and systems for operating the device are also described.

Description

DEVICES, SYSTEMS AND METHODS FOR PROCESSING OR PROPAGATING PLANTS
BACKGROUND OF THE INVENTION
Field of the Invention
The present invention relates to agriculture and horticulture, and more particularly to tasks that typically involve handling or inspection of individual plants.
Description of the Related Art
Plant processing and propagation includes many labour-intensive tasks, such as pruning, localized spraying, clipping, and the like. For example, clipping of plants is a time-consuming and laborious task. Clipping is the task of putting a rubber band or plastic clip around a plant's main stem and a supporting structure, such as a stake, at a particular point along the stem to provide additional support to the plant. The clipping task involves a human worker bending or kneeling on the floor while using two hands to stretch a rubber band or to put a plastic clip around the plant and the stake. This is a physically demanding and painstaking task, and most modern greenhouses still require a significant amount of manual labour to execute the clipping task. As an example, in one propagation facility (e.g., Roelands Plant Farms, Lambton Shores, ON., Canada), more than 25 million seedlings grow and are clipped per year. The sheer volume of propagated seedlings each year and the associated labor costs and work-related injuries present a significant problem in greenhouse seedling propagation.
Indoor farms and greenhouses across North America use vegetable and fruit seedlings produced in specialized propagation facilities. These facilities produce 140 billion seedlings each year for the North American market alone. The clipping of each one of these seedlings is currently performed manually in all propagation facilities. This is a very laborious undertaking that can require a human worker to bend or kneel on the floor or reach across a wide table (typically 1.5 meters) to clip the plants. It is a physically demanding task that, due to the awkward body posture, takes a high toll on workers' bodies and can lead to back injuries. The clipping is necessary prior to loading the seedlings onto a shipping truck and is the most time-consuming part of the preparation of seedlings. There is a short time window to clip the seedlings, which creates a bottleneck that significantly slows the entire process.
Accordingly, there is a need for alternative solutions to clipping of seedlings and plants, and more generally there is a need for alternative solutions to processing or propagating plants.
SUMMARY OF THE INVENTION
In an aspect there is provided, a clipping device, comprising: a passively rotating spool supporting and supplying a roll of wire; a feeder wheel pushing the wire through an entrance of a wire guide and an exit of a wire guide; a cam co-rotationally coupled to the feeder wheel; a rotational actuator driving rotation of the feeder wheel and the cam; a bender positioned proximal to the exit of the wire guide, the bender providing a strike surface for curving the wire into a circular clip; a cutter positioned proximal to the exit of the wire guide, the cutter providing an edge for cutting the wire; a lever having a first end abutting the cam, a second end positioning the cutter and the bender, and an intermediate pivot point; the lever following the cam to move from a first pivot position aligning the bender strike surface with wire pushed through the exit end to a second pivot position sweeping the cutter edge across the exit end to cut the wire.
In another aspect there is provided, a method for processing plants, the method comprising: acquiring image data of a plant with an image sensor; analyzing the acquired image data with a machine vision component to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identifying a target point in the plant based on the segmented image data; determining distance/depth from the image sensor to the target point in the plant; generating a control signal based on the distance/depth and sending the control signal from a controller to position a robot at the target point in the plant.
In yet another aspect there is provided, a system for processing plants, the system comprising: a memory configured to store image data; an image sensor configured to acquire image data of a plant; a processor configured to: analyze the acquired image data with a machine vision component trained to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identify a target point in the plant based on the segmented image data; determine distance/depth from the image sensor to the target point in the plant; a controller configured to generate a control signal based on the distance/depth and communicate the control signal to a robot to position the robot at the target point in the plant.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1A shows a clipping device with a clamp in an open position.
Figure 1B shows a clipping device with a clamp in a closed position.
Figure 2 shows wire feeding components of the clipping device.
Figure 3A shows a first view of wire curving components of the clipping device.
Figure 3B shows a second view of the wire curving components.
Figure 3C shows examples of various configurations of curved wire clips.
Figure 4A shows wire cutting components of the clipping device.
Figure 4B shows interaction of wire cutting components so that a rotating cam pivots a wire cutting lever to execute a cutting motion of a cutter.
Figure 5A-5E show clamping components of the clipping device formed as first and second opposing claw shaped arms that are pivoted toward each other by a linear actuator to transition from an open position (Fig. 5A) through intermediate positions (Figs. 5B and 5C) to a closed position (Fig. 5D), followed by forward advance of the clip-forming head (Fig. 5E).
Figure 6 shows connection of wire feeding components, wire curving components, wire cutting components, and clamping components in absence of actuators and supporting structures.
Figure 7 shows a mathematical model as it applies to the transverse grooves and the pulling of the wire to the feeder.
Figure 8 shows the relationship between the relative location of the Bender to the Wire Feeder and the diameter of the clip.
Figure 9 shows two different profiles of the Bender. Changing a linear translational position of surface (a) and changing an angular rotational position of surface (b).
Figure 10 shows a schematic that demonstrates a change in Cam profile causing a different shaping of the Clip.
Figure 11 shows variant wire curving components of the clipping device modified to attach a heat coil to the wire guide to heat a plastic wire passing through a bore formed in the wire guide.
Figure 12 shows a block diagram illustrating a first variant method for handling plants including machine vision algorithms and robot motion algorithms.
Figure 13 shows a block diagram illustrating a second variant method for handling plants including machine vision algorithms and robot motion algorithms.
Figure 14A shows a block diagram illustrating a third variant method for handling plants providing a more specific example of machine vision algorithms.
Figure 14B shows a block diagram illustrating a fourth variant method for handling plants providing a more specific example of machine vision algorithms - schematic of the real-time point localization using feature-based soft margin SVM-PCA method.
Figure 15 shows a block diagram illustrating a system map for handling plants including machine vision algorithms and robot motion algorithms.
Figure 16 shows the clipping device with a stereo camera installed on a robotic arm.
Figure 17 shows examples of clipping points identified by expert farmer selections.
Figure 18 shows color value of pixels plotted in four different color spaces; for plant recognition, the concentration of pixels with similar color values in LAB is better than in other color spaces.
Figure 19 shows schematic steps of the stem recognition using adaptive color image segmentation based on optimized LAB color space.
Figure 20 shows schematic steps of the wooden stake segmentation/recognition.
Figure 21 shows comparison of the histogram (left) and kernel density estimation (right) constructed using the same data. The dashed individual kernels make up the kernel density estimator.
Figure 22 shows schematic steps for computing the Principal Orientation of the Histogram Gradient.
Figure 23 shows schematic steps of a multi-stage point density method to identify the most suitable clipping point along the seedling’s main stem for different types of vegetables including peppers, tomatoes, and cucumbers.
Figure 24 shows comparison of plant recognition for three types of seedlings (pepper, tomato, and cucumber) after applying four comparator automatic adaptive segmentation methods and the presently disclosed adaptive segmentation based on feature descriptors (entropy and variance).
Figure 25 shows stem and stake recognition of pepper (1), cucumber (2), bell pepper (3), and tomato (4) seedlings after applying the adaptive color image segmentation based on feature descriptors (entropy and variance) and the hybridization of the Otsu method and median filter; different cameras were used to take images in different lighting conditions and backgrounds to check the robustness of the algorithm.
Figure 26 shows suggested clipping points after applying the multi-stage point density method; the stereo camera matches left and right images to find the distance of the clipping point from the clipping device and calculates the orientation of the clipping device related to the suggested clipping point.
Figure 27 shows suggested clipping points using multi-stage point density algorithm for samples 1 and 2; in sample 3, although a suitable clipping point could be identified on the seedling, the stake is behind a leaf and not accessible; sample 4 shows a case where neither the stem nor the stake was accessible.
Figure 28 shows the clipping device and stereo camera on a robotic arm; the stereo camera takes images from the seedling and stalk; after recognizing the suitable clipping point using the multi-stage point density method, the robot moves the clipping device near the recognized clipping point; and the clipping device makes a clip around the stem and stake.
Figure 29 shows an impedance controller block diagram of a servo motor in the clipping device.
Figure 30 depicts trajectory and torque profile of a servo motor in automated and hand-held clipping device platforms during a complete cycle of clip production.
Figure 31 shows 3D renderings depicting (a) a potential schematic design of a robotic arm, and (b) a gantry system equipped with multiple robotic arms for the stem-stake coupling system, (c) a schematic of a robotic stem-stake coupling system with two gantries and nine robotic arms, and (d) a photograph of a working robotic system with a single gantry and two robotic arms.
Figure 32 shows a block diagram of a robotic system that can include multiple gantries (1 to m) supporting varying numbers of robotic arms (1 to n) equipped with an automatic clipping device (ACD), a robotic control unit (RCU), and a stereo camera. This object-oriented design offers flexibility to adjust the number of gantries and arms as needed.
Figure 33 shows a graphical user interface (GUI) of the robotic stem-stake coupling system with fifteen distinct subsections.
Figure 34 shows evaluation results of five robotic arm configurations for the stem-stake coupling task based on eleven key parameters.
Figure 35 shows a 5-degree-of-freedom (5-DOF) robotic arm, custom-designed and fabricated to fulfill the requirements of an experimentally implemented robotic system; configuration and main components of the robotic arm are shown from the left (a) and right (b) perspectives.
Figure 36 shows a logic flowchart outline of the robotic arm shown in Fig. 35.
Figure 37 shows detailed analysis of the position, torque, and error for each joint in achieving the desired position of the robotic arm. This figure illustrates the discrepancies between the commanded and actual positions, along with the corresponding torque applied at each joint.
Figure 38 shows comprehensive examination of the angle, torque, and error for each joint in achieving the desired orientation of the ACD. This figure highlights the differences between the target and achieved joint angles, as well as the torque required for each joint to reach the specified orientation.
Figure 39 shows (a) a schematic representation of error calculations; (b) a box-and-whisker plot illustrating the robotic arm’s and machine vision’s accuracy in determining and reaching a specific point within the working space; and (c) shows accuracy and repeatability of the robotic arm in reaching a specific point in the working space - the black (darker colored) dots represent the accuracy and repeatability of the robotic arm alone and the blue (lighter colored) dots indicate the accuracy and repeatability of the integrated system using both machine vision and robotic arms.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
With reference to the drawings, a clipping device, a method of identifying a clipping point, and a method of automated or semi-automated control of the clipping device are described.
Fig. 1 shows a perspective view of the clipping device (CD) 10 and its various parts. The CD 10 makes a clip and places it around a wooden stake and the main stem of a seedling or a flower. The clip provides additional support to the plant and avoids damage during transportation. The working principle of the CD 10 is based on feeding a wire 21 or equivalent thin filament from a rotating wire feeder 12 through a rotating wheel (main feeder) 14 driven by an actuator. The wire 21 is pulled from the wire feeder 12 by the actuated rotation of the main feeder 14 and is pressed against a feeder supporter 16. When the main feeder 14 rotates, it pulls the wire forward against the feeder supporter 16 and pushes the wire through an opening inside the wire guide 18, which defines a bore or channel shape with an input opening 19a proximal to the feeder supporter 16 surface that abuts wire 21 and an opposing output opening 19b proximal to a bender 20. The main feeder 14 pushes the wire through and out of the wire guide 18, more specifically out of output opening 19b, to strike the bender 20 (which is optionally configured as a tunable/adjustable mechanism that is adjusted to change the diameter or shape of a wire clip), and the force of pushing the wire against a strike surface 81 of the bender 20 shapes the curling wire into a ring-shaped clip 22. A cutter 24 cuts the wire at the end of each cycle when a full clip is formed. A first actuator, such as an electric motor (Servo Motor 1) 26 or another type of actuator such as a pneumatic actuator, turns the rotating wheel of the main feeder 14, and as the wire 21 passes through the wire guide 18 the force applied by the bender 20 shapes the wire into the clip 22. The same actuator also drives the cutter 24 to cut the wire at the end of one cycle to release the clip 22 from the feeding wire. The components of the wire feeder 12, main feeder 14, feeder supporter 16, wire guide 18, and first actuator 26 are interconnected in a desired orientation to one another by connection to a frame 28.
By locating the clipping device near a suitable point near the stem of the plant, the wire can wrap around the stem and the wooden stake as the wire is shaped into a clip 22. The size and shape of the clips can be changed with respect to the seedling or plant. The wire can be fed continuously, with the bender 20 shaping the clip 22 and the cutter 24 releasing the clip 22, and therefore it is not necessary to use pre-made clips in a cartridge. Optionally, the bender 20 is tunable/adjustable, and a tuning screw 80 on the bender 20 allows tuning of the bender 20 to change an impact or strike point of the feeding wire 21 against the strike surface 81 of the bender 20 and therefore change the force applied to the wire - changing the applied force allows for changing the shape and diameter of the clip 22 as well as the number of overlaps.
The CD also includes a clamp comprising first and second opposing claw-shaped arms (30, 32) that have a specialized shape. The claw-shaped arms are useful to bring the stem and wooden stake close to each other prior to the clipping. A second actuator such as an electric motor (Servo Motor 2) 34 or another type of actuator such as a pneumatic actuator is used to close (and open) the claw-shaped arms. Fig. 1A shows the claw-shaped arms in an open position, while Fig. 1B shows a closed position. The specific shapes of the claw-shaped arms bring the stem and wooden stake closer as the clamp is closed by the second actuator. The same actuator also brings the head or the clip-forming portion of the clipping device (while closing the claw-shaped arms) to the appropriate position near the stem and wooden stake, i.e., the second actuator translates frame 28 and its connected components relatively closer to the claw-shaped arms concurrently with closing, while translating frame 28 and its connected components relatively farther away from the claw-shaped arms during opening.
The claw shaped arms (30, 32) can be considered more generally as an example of a clamp, and the claw shaped arms (30, 32) are first and second opposing jaws (30, 32) of the clamp. The first and second opposing jaws are rotationally mounted to a clamp frame 40 of the device at first and second clamp rotation points (42, 44), respectively, the clamp aligned with the exit of the wire guide 18.
The second actuator 34 is pivotably coupled to both of the first and second opposing jaws (30, 32) at a common third clamp rotation point 46, the common third clamp rotation point located approximately equidistant from the first and second clamp rotation points, the second actuator 34 driving counter-rotation of first and second opposing jaws (30, 32) to circumferentially reduce an open space between the first and second opposing jaws (30, 32). More specifically, counter-rotation means that the first and second opposing jaws rotate in opposing directions such that one of the jaws rotates clockwise while the other jaw rotates counterclockwise. The second actuator 34 is coupled to the common third clamp rotation point 46 by a rack-and-pinion transmission. The second actuator 34 is directly connected to a pinion gear 48, and pinion gear 48 engages rack 50, which is attached in a fixed position in clamp frame 40. Rotational motion of pinion gear 48 along rack 50 translates a linear guide holder 52 relative to rack 50, and consequently also translates linear guide holder 52 and the common third clamp rotation point 46 relative to clamp frame 40. A linear arm holder 54 extending from the linear guide holder 52 has a proximal end connected to the linear guide holder 52 and a distal end pivotally connected to the common third clamp rotation point 46. Linear sliding of the linear guide holder 52 is aided by bushings 56 slidably engaging linear tracks 58 that are oriented parallel to rack 50. Translation of the linear guide holder 52 and its linear arm holder 54 applies force to translate the common third clamp rotation point 46 relative to clamp frame 40 and to produce counter-rotational motion at the first and second clamp rotation points. The three clamp rotation points are linked by coupling to first and second slots (60, 62) formed within the opposing first and second jaws, respectively. Each slot extends from a first end proximal to a capturing surface of the jaw (i.e., the claw surface that captures the stem) to a second end distal from the capturing surface and proximal to the linear guide holder 52 and its linear arm holder 54. The first and second slots cross over each other, and the crossing point of the first and second slots provides for the coupling of the common third clamp rotation point. More specifically, the second/distal ends of each of the first and second slots are rotationally coupled to the clamp frame 40 at the first and second clamp rotation points (42, 44), while the crossing point of the slots is coupled to the distal end of the linear arm holder 54 forming the common third clamp rotation point 46.
The linear guide holder 52 is attached to frame 28, and therefore as linear guide holder 52 translates relative to clamp frame 40, frame 28 and its attached components also move linearly relative to clamp frame 40 (for example, comparing Fig. 1A to Fig. 1B shows that as linear guide holder 52 moves linearly relative to clamp frame 40 and claw-shaped arms (30, 32), frame 28 moves with linear guide holder 52).
The materials used to feed the CD can be different depending on the need of the user. The feeding wire can be made of metals such as steel, copper, etc., or plastic material such as polyethylene or polyamide. If the plastic material is used, a heater is typically used for pre-heating the plastic material prior to being fed through the wire guide 18. The energy required to move the mechanisms of the CD can be from AC or DC sources as well as pneumatic or hydraulic actuators. There is an optical sensor 36 with corresponding trigger 38 or other equivalent sensor trigger coupled to the main feeder 14 wheel that sends a pulse after completing each cycle of clipping.
The CD can be used by farmers as a handheld device, or it can be installed on automatic machines or robots. If used as a hand-held device, a single button on the CD allows a farmer to use the CD manually to perform clipping. The manual CD could be used without claw-shaped arms (30, 32).
If the CD is installed on an automatic machine or robot, there is a place on the clipping device to add an imaging sensor or camera, such as stereo camera 70, which provides the necessary information for the automatic machine or robot to find the most suitable clipping point automatically and gives the automatic machine or robot sensory feedback related to the coordinates and orientation of the clipping device on the automatic machine or robot. A control board of the clipping device can connect to the automatic machine or robot to follow their commands. The control board can support different kinds of communication protocols such as CAN, I2C, RS232, and RS485 to connect a variety of devices both wired and wireless.
Fig. 2 shows wire feeding components of the CD. The main feeder 14 is connected to and driven by servo motor 1. The main feeder 14 is shaped as a wheel with a central groove 74 defined on the perimeter or circumference of the wheel. The central groove 74 provides a wire path in the circumferential center of the main feeder 14. In addition, there are a plurality of transverse grooves 76 (i.e., transverse to the central groove) defined on the circumference of the main feeder 14. These transverse grooves 76 make small indentations in the wire 21 and push it forward by the force created by these indentations. The number of transverse grooves and the diameter of the main feeder determine the length of the clip 22. A greater number of transverse grooves results in more wire being pulled, and thereby a longer clip.
Fig. 3 shows wire curving components of the CD 10. There is a hole/bore (bore 19 communicatively extending from input opening 19a to output opening 19b) inside the wire guide 18 along the direction the wire is being pulled. This hole/bore facilitates the feed wire 21 following a direct linear path and reduces, and preferably prevents, any bending of the wire before it reaches the bender 20. The bender 20 applies a force to the wire 21 to bend it. The amount of the applied force depends on the pulling force of the main feeder. The surface and the angle of the profile of the bender 20 affect the direction of the force applied to the wire. The tuning screw 80 on the bender 20 tunes the direction of the force applied to the wire 21 by adjusting the relative position of the bender (more specifically, a bender strike surface 81 that provides an impact point with wire 21) with respect to the wire guide 18. The amount of applied force and its direction determine the shape and diameter (i.e., the curvature of the clip) as well as the number of wire overlaps in the clip 22. Different surface profiles of the bender can be used to make different clip shapes. The relationship between the relative position of the bender to the wire guide and the diameter of the clip is given in Fig. 8. The number of transverse grooves 76 and the diameter of the main feeder 14 determine how fast the wire is pulled, thereby determining the length of the wire in each clip. The relationship between the number of transverse grooves and diameter of the main feeder and the length of a clip is given in Fig. 7.
Fig. 4A shows wire cutter components of the CD. Servo motor 1 rotates the main feeder 14 and cutter guide with cam 84 (referenced for brevity as cam 84) - the wheel of the main feeder 14 and the cam 84 are fixed together so as to move co-rotationally. The cam 84 abuts and engages a cutter lever with cam 86 at a first end and cutter 24 mounted on a second end (referenced for brevity as lever cam 86) throughout its rotation. The lever cam 86 is pivotally coupled to the wire guide 18, with a first end of the lever cam engaging the cam 84 and a second end of the lever cam forming the cutter 24 with the bender 20 mounted on top of the cutter 24. Wire 21 is pulled by capture within the circumferential central groove 74 and associated transverse grooves 76 during rotation of the main feeder 14. A resting gap 88 is formed as a flattened circumferential portion on the main feeder. The main feeder rotates about 320 degrees. After that, due to the resting gap on the main feeder, the wire is not pulled by the main feeder. The cam 84 is aligned with the resting gap 88. Therefore, synchronized with a cessation of pulling force due to the resting gap 88, the cam 84 pushes the lever cam 86 downward, causing the lever cam 86 to rotate, which causes the cutter 24 to cut the wire 21 and separate the clip 22 from the wire 21. Fig. 4B shows a resting position and a cutting position of the lever cam 86 and its associated cutter 24 throughout rotation of cam 84. For a majority of the rotation of cam 84 (for example, approximately 320 degrees) the lever cam 86 and cutter 24 are biased towards a resting position, and when the resting gap 88 faces the wire 21 and the feeder supporter 16, the cam 84 engages the lever cam 86 to pivot the lever cam 86 and cutter 24 into a cutting position. The cam 84 and resting gap 88 are aligned, and the cam 84 and lever cam 86 are also shaped and configured to engage and then disengage within the resting gap 88 portion of the rotation of the main feeder 14, so that the lever cam 86 and its associated cutter 24 pivot to a cutting position and then clear the cutting position to return to a resting position, in alignment and synchrony with the portion of the rotation in which the resting gap 88 temporarily ceases pulling of wire 21.
Fig. 5 shows the collector mechanism/clamping components of the CD. From the resting/open position (Fig. 5A), the servo motor 2 pushes the linear arm holder 54 forward and closes the claw-shaped arms. When the linear arm holder 54 comes forward, the claw-shaped arms close and bring together the main stem and wooden stake in the center of the claw-shaped arms. The linear arm holder 54 continues to move forward and close the claw-shaped arms completely (Figs. 5B and 5C). The arms tightly hold the stem and wooden stake and collect them in the center with minimal damage to the stem due to the special shape and profile of the claws (Fig. 5D). In the last step, all parts involved in making and cutting the clip (i.e., frame 28 and its attached components) move forward without closing the claw-shaped arms any further due to the shape of the first and second slots (60, 62) and profile of the claw-shaped arms (Fig. 5E). More specifically, the first and second slots (60, 62) are biphasic in shape, with symmetrically mirrored first and second portions (first portions distal from the capturing surface of the claw-shaped arms and second portions proximal to the capturing surface of the claw-shaped arms), so that when the common third clamp rotation point 46 moves forward (in a direction distal to proximal to the capturing surface of the claw-shaped arms) along the corresponding first portion of the slots (60, 62) the claw-shaped arms converge or close by rotating towards each other, or conversely, when the common third clamp rotation point 46 moves backward (in a direction proximal to distal from the capturing surface of the claw-shaped arms) along this same first portion of the slots (60, 62) the claw-shaped arms expand or open by rotating away from each other (Fig. 5A-5D), and when the common third clamp rotation point 46 moves along the corresponding second portion of the slots (60, 62) the claw-shaped arms remain in an unchanged/stationary position relative to each other (Fig. 5E). The stem and wooden stake stay in the center of the claw-shaped arms, while the main mechanism (i.e., frame 28 and its attached components) that makes and cuts the clip approaches the stem and wooden stake. This causes minimal interference between the CD and plant/seedling leaves during the clipping process.
Fig. 6 shows the various components shown in Figs. 1-5 assembled without actuators and support structures for convenience of illustration of interaction of these components.
Fig. 7 shows a mathematical model as it applies to the transverse grooves and the pulling of the wire by the feeder, with the length of the clip given by an equation (rendered as an image in the original and not reproduced here) in which L is the length of the clip, n is the number of transverse grooves, k_1 is a material-dependent coefficient (for copper wire, illustratively k_1 = 1, e.g., 1 mm/rad), \alpha is the angle of the resting gap, \mu_s is the coefficient of static friction (illustratively \mu_s = 0.7), \mu_k is the coefficient of sliding friction (illustratively \mu_k = 0.3), and r is the radius of the main feeder wheel (illustratively r = 0.5). k_1 is a weighting factor included in the mathematical model so that the output value can be tuned.
Fig. 8 shows the relationship between the relative location of the bender 20 and its strike surface 81 to the feed wire 21 and the diameter of the clip.
Fig. 9 shows two different profiles of the bender. Changing the location of surface (a) and changing the angle of surface (b). Changing the location of the surface and the angle of the bender changes the direction and amount of force on the wire and the shape of the clip.
Fig. 10 shows how changing the cam profile causes different shapes of the clip. A different diameter of the cam causes rotation of the lever cam, which in turn results in changing the position of the bender. Changing the position of the bender changes the direction of the force applied to the wire. As a result, the clip takes a helical shape.
Fig. 11 shows variant wire curving components of the clipping device modified to attach a heat coil to the wire guide to heat a plastic wire passing through a bore formed in the wire guide. A heater element, such as a heating coil, heating bar, and the like, can be used to heat wire passing through the bore of the wire guide to make the wire more bendable, moldable, malleable, flexible, and the like for ease of reshaping wire that exits from the bore and strikes a bender surface. For example, the variant wire curving components shown in Fig. 11 provide a clipping device equipped with a plastic wire roll with a heating coil provided to heat the plastic wire. The clipping device may be equipped with various materials for making a clip, for example materials such as copper, stainless steel, polyethylene, polyamide, polyesters, or different kinds of plastic wires. To use plastic materials to make the clip, a heating coil can be coupled to the wire guide to heat the plastic material as it passes through the bore of the wire guide to make the plastic material more bendable, moldable, malleable, flexible, etc. so as to ease reshaping of the plastic material wire exiting the bore of the wire guide by the wire curving components to form a plastic material clip.
Fig. 12 shows a block diagram illustrating a first variant method for handling plants including machine vision algorithms and robot motion algorithms. Fig. 13 shows a block diagram illustrating a second variant method for handling plants including machine vision algorithms and robot motion algorithms.
Fig. 14A shows a block diagram illustrating a third variant method for handling plants providing a more specific example of machine vision algorithms.
Fig. 14B shows a block diagram illustrating a fourth variant method for handling plants providing a more specific example of machine vision algorithms - schematic of the real-time point localization using feature-based soft margin SVM-PCA method.
Fig. 15 shows a block diagram illustrating a system map for handling plants including machine vision algorithms and robot motion algorithms.
The currently disclosed devices, systems and methods for processing or propagating plants have been validated by experimental testing. Experimental testing results demonstrate the ability of the currently disclosed device, system and method to benefit plant processing in autonomous, semi-autonomous and manual modes. The following experimental examples are for illustration purposes only and are not intended to be a limiting description.
EXPERIMENTAL EXEMPLIFICATION. EXPERIMENTAL EXAMPLE 1.
A time-consuming and laborious task in facilities specializing in seedling propagation is the clipping of new seedlings. Clipping is the task of putting a rubber band or plastic clip around the seedling's main stem and a wooden stake at a particular point along the stem to provide additional support to the seedling and avoid damage during transportation.
The clipping task involves a human worker bending or kneeling on the floor while using two hands to stretch a rubber band or to put a plastic clip around the plant and the stake. This is a physically demanding and painstaking task, and most modern greenhouses still require a significant amount of manual labour to process the large number of seedlings in a propagation facility. As an example, in one propagation facility (e.g., Roelands Plant Farms, Lambton Shores, ON., Canada), more than 25 million seedlings grow and are clipped per year. The sheer volume of propagated seedlings each year underscores the need for, and benefit of, the development of a robotic solution for automated clipping of the seedlings that has a real and measurable impact on the efficiency of the process, productivity, and quality of the products, the reduction of the labor costs associated with the clipping task, and the prevention of work-related injuries, such as back injuries, strains, and sprains associated with the awkward body position.
The robotic clipping solution contains two main parts: a mechatronic unit that performs the act of clipping and a vision unit that identifies the clipping points. The focus of this paper is on the vision unit, which replicates human visual functionalities and perception to identify a suitable clipping point along the seedling's main stem for different types of vegetables including peppers, tomatoes, and cucumbers. Machine vision has been widely used to support precision agriculture by providing automated solutions for tasks that are traditionally performed manually. In recent years, the application of machine vision and different types of visual processing algorithms has evolved in some fields of agriculture {1/mavridou2019machine} such as spraying fertilizers and pesticides {2/berenstein2017automatic}, plant detection and harvesting, mapping of weed populations, grafting, irrigation, automatic grading, detecting plant diseases {3/liu2021plant}, identifying ripe fruits {4/kang2020real}, crop-weed classification {5/su2021data}, and cutting or pruning {6/kolmanic2021algorithm}. The objective of the current study is to find the most suitable clipping point on a seedling. This is a challenging and time-consuming problem to be solved using image processing since it requires modeling the human's cognitive process and past experience and training in identifying such points. We studied various approaches in the literature that can be potentially used to tackle the problem. Each approach has advantages and limitations regarding the specific problem we intend to solve. Some studies such as multi-feature patch-based segmentation {7/fan2021multi}, adaptive spectral-spatial gradient sparse regularization {8/zhang2020image}, the adaptive snake algorithm model {9/shantkumari2021grape}, and a kernel-based algorithm based on the adaptive Kalman filter {10/shirzi2011active} are based on more rigorous mathematical analysis and quantitative aspects of computer vision. The limitations of using these algorithms for recognizing the clipping point are that the leaves often occlude a significant portion of the stem, and other seedlings also cover some parts of the stems. In addition, lighting conditions and backgrounds vary in different greenhouses. The varying physical and environmental conditions make the identification of the clipping point a challenging problem, and it demands the adaptation of vision algorithms for reliable real-time results. Selecting a suitable clipping point is performed using heuristic knowledge and experience of human workers, which are difficult to code as part of the algorithm. The resurgence of feature-based methods using machine learning techniques and complex optimization frameworks provides a good candidate for modeling this process. Examples of such methods are the optimized image registration and deep learning segmentation approach {11/kerkech2020vine}, adaptive multi-vision technology {12/chen2020three}, and image fusion technology {13/li2021recent}. There are large variations in terms of the type, size, shape, and pattern of the seedlings that decrease the performance of the machine learning techniques and optimization frameworks. In addition, there are only a few salient features, which makes distinguishing the clipping point difficult using optimization techniques.
In addition to the above challenges, algorithms based on the convolutional neural network (CNN), like the multi-network fusion algorithm with transfer learning {14/bai2022multi} and point clouds using deep learning CNNs {15/jayakumari2021object}, need hundreds of labeled images to train the network for each type of seedling and have long processing times {16/kolar2018transfer}. These challenges make the above-mentioned algorithms difficult to use and/or less robust in our application. To address these challenges, we propose to take advantage of a combination of algorithms to overcome the difficulties of finding the most suitable clipping point. As such, we have developed a multi-stage point density method as a hybrid approach based on analytical image processing and data-driven learning algorithms. The analytical part of the multi-stage point density method is based on the point density variation, kernel density estimation, the principal orientation of the histogram gradient, and normalized cross-correlation for matching. We applied the algorithm to hundreds of images of real seedlings from Roelands Plant Farms. To evaluate the performance of our algorithm, we asked expert farmers to review and evaluate the results for accuracy.
MATERIALS AND METHODS. Our proposed multi-stage point density method uses the images taken by a stereo camera to find a suitable clipping point along the seedling’s main stem and stake. In the first step, an adaptive color image segmentation segments the plant based on the two feature descriptors (variance and entropy) of the pixels. An artificial neural network (ANN) tunes the cut-off points of the multiple auto-threshold for each pixel with respect to the variance and entropy around the pixel. To find the stem, the leaves are eliminated from the plant using morphological image processing techniques (hit-or-miss, thinning, and convex hull). A spline matching algorithm and hybridization of the Otsu method and median filter detect the wooden stake. Using the boundary and skeleton of the segmented plant, we then find the region of interest to limit the searching area, avoid unwanted points, and accelerate the process. The point density variation calculates the disparity of intensity of colors on a map. The kernel density estimator estimates the population of the finite data sample by smoothing the fundamental data. Using the principal orientation of the histogram gradient, and normalized cross-correlation for matching, the multi-stage point density method suggests a suitable clipping point. In the last step, the algorithm checks the accessibility of the wooden stake and stem for the robotic arm and clipping device and maps the stereo camera coordinate system to the robot coordinate system to provide necessary sensory feedback for the controller of the robot. To check the performance of the multi-stage point density method, we made and installed a mechatronic unit that includes the clipping device on a general purpose robotic arm (i.e., KUKA LWR IV). Our novel clipping device curls a thin wire to simultaneously make and attach the clip to the plant. An optimized stereo camera (Megapixel/USB5000W02M) has been placed on the clipping device to take images from the plants and send them to the vision algorithm. Fig. 16 shows the installed clipping device and the stereo camera used for evaluation purposes. An automated clipping system may include a plurality of specialized robotic arms equipped with such devices.
Suitable Clipping Point. Finding a suitable clipping point on the seedlings is the most imperative, challenging, and time-consuming task of the machine vision of the robotic clipping system. The clipping point can be at the highest point, above the uppermost node on the main stem. If the length of the main stem between two nodes is short, the clipping point is selected below the highest node or axil. However, the leaves are dense around the highest node, and some parts of the main stem are behind the leaves. Thus, recognizing the main stem and petiole is difficult. The different shapes and types of seedlings make the recognition process even harder. The selection of the clipping point is a cognitive process that relies on heuristic information. Fig. 17 shows some seedlings and the preferred clipping points that expert farmers validated.
Stem Recognition. Comparing the colored histograms of the seedling images shows that the variation of the color channels for the stems of different seedlings is diffused. In addition, the lighting conditions are different, and the color combinations of the stems and leaves interfere with the background in some ranges, which makes global segmentation impractical. Overcoming these issues involves creating an adaptive algorithm based on the adaptive LAB (also known as CIELAB) color space for segmenting the stem. Fig. 18 shows the separation of the color channels in different color spaces. As seen, the interference of the color coordinates of seedlings with other objects is less in the LAB color space than in other color spaces such as RGB, YCbCr, or HSV. As a result, we use an adaptive color image segmentation in the LAB color space for stem recognition. To this end, we use an ANN to predict the optimized cut-off values for the locally adaptive threshold for each pixel based on the entropy and variance around the pixel. Using this approach, we can achieve better results in different lighting conditions and for different types and shapes of seedlings. The input features of the ANN are the variance and entropy of the sub-image around the pixel, and the output is the sub-range of the L, A, and B channels' cut-off values for the multilevel threshold.
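To make the adaptive thresholding step concrete, the following Python sketch illustrates one plausible implementation. The patent does not publish code, so the window size, the 16-bin entropy approximation, the (low, high) output layout, and the helper names local_features and segment_stem are all assumptions, with OpenCV and scikit-learn standing in for whatever libraries were actually used.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPRegressor

def local_features(gray, win=15):
    """Per-pixel variance and entropy over a win x win neighborhood."""
    g = gray.astype(np.float32)
    mean = cv2.blur(g, (win, win))
    variance = cv2.blur(g * g, (win, win)) - mean * mean
    # Entropy from a coarse 16-bin local histogram (an approximation).
    bins = (g / 16).astype(np.uint8)
    entropy = np.zeros_like(g)
    for b in range(16):
        p = cv2.blur((bins == b).astype(np.float32), (win, win))
        p = np.clip(p, 1e-12, 1.0)          # avoid log2(0)
        entropy -= p * np.log2(p)
    return variance, entropy

def segment_stem(bgr, model: MLPRegressor):
    """Per-pixel LAB thresholding with ANN-predicted cut-off values."""
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    var, ent = local_features(cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY))
    feats = np.stack([var.ravel(), ent.ravel()], axis=1)
    # Assumed output layout: (low, high) cut-offs for each of L, A, B.
    cuts = model.predict(feats).reshape(lab.shape[0], lab.shape[1], 6)
    lo, hi = cuts[..., 0::2], cuts[..., 1::2]
    mask = np.all((lab >= lo) & (lab <= hi), axis=-1)
    return (mask * 255).astype(np.uint8)
```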
The variance is a measure of variability and provides an indication of how the pixel values are spread, as shown below,

\sigma^2 = \sum_{i=0}^{L-1} (i - \mu)^2 H(i) \qquad (1)

where \mu is the mean value, which for each sub-image around the pixel can be obtained as

\mu = \sum_{i=0}^{L-1} i \, H(i)

where H(i) = n_i / N, n_i is the number of pixels with a gray level of i, N represents the total number of pixels in the sub-image, and L is the maximum gray level.
The entropy measures the average uncertainty of the information source, defined as the corresponding states of the intensity level to which individual pixels can adapt. The higher the value of the entropy is, the more detailed the image is {17/deng2009entropy}. The entropy is defined as follows,
E = -\sum_{i=0}^{L-1} H(i) \log_2 H(i) \qquad (2)
where all parameters are as defined before.
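For concreteness, both descriptors can be computed directly from the gray-level histogram of a sub-image, as in this minimal NumPy sketch following Equations (1) and (2); the function name and the random test window are illustrative:

```python
import numpy as np

def subimage_stats(sub, levels=256):
    """Variance and entropy of a gray-level sub-image per Equations (1) and (2)."""
    n_i = np.bincount(sub.ravel(), minlength=levels)  # pixels per gray level i
    H = n_i / n_i.sum()                               # H(i) = n_i / N
    i = np.arange(levels)
    mu = np.sum(i * H)                                # mean gray level
    variance = np.sum((i - mu) ** 2 * H)
    nz = H > 0                                        # avoid log2(0)
    entropy = -np.sum(H[nz] * np.log2(H[nz]))
    return variance, entropy

rng = np.random.default_rng(0)
window = rng.integers(0, 256, (15, 15), dtype=np.uint8)  # a 15 x 15 pixel window
print(subimage_stats(window))
```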
Reviewing images shows the non-linear relationship between the inputs, i.e., the variance and entropy, and the output, the cut-off values for the multilevel thresholds. An ANN has a strong capability for representing complex, highly nonlinear relationships between input and output features. It has been shown that a network with only one hidden layer, but sufficient neurons, can express an arbitrary function {18/qi2019applying}. Thus, a three-layer architecture of an ANN consisting of an input layer, hidden layer, and output layer is used for obtaining optimized L, A, and B ranges {19/aasim2022machine}. We used labeled images from three types of seedlings (tomato, cucumber, and pepper) to train the ANN. K-fold cross-validation, which repeatedly fits the model on k-1 folds of the data and evaluates it on the held-out fold, was used to estimate the skill of the machine learning model on the limited data sample we used. Fig. 19 shows the block diagram of the stem recognition algorithm. After camera calibration using Zhang's method {20/zhang2000flexible}, the quality of the images was enhanced and restored using pre-processing methods that were a combination of equalization techniques, high-boost filters, and morphological boundary enhancement {21/thapar2012study}. After segmenting the plant using adaptive color image segmentation, morphological filtering techniques were used to remove noise from the segmented stem {22/ruchay2017impulsive}. Using the hit-or-miss, thinning, and convex-hull techniques, the leaves were then eliminated.
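A minimal sketch of such a training loop follows, assuming scikit-learn; the random placeholder arrays, hidden-layer size, and fold count are illustrative stand-ins for the labeled-image features and whatever settings the authors actually used.

```python
import numpy as np
from sklearn.model_selection import KFold
from sklearn.neural_network import MLPRegressor

# X: per-pixel (variance, entropy) features; y: labeled (L, A, B) cut-off pairs.
# Random placeholders stand in for features extracted from the labeled images.
rng = np.random.default_rng(1)
X, y = rng.random((1000, 2)), rng.random((1000, 6))

scores = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    # One hidden layer, per the cited result that a single hidden layer with
    # sufficient neurons can express an arbitrary function.
    net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    net.fit(X[train_idx], y[train_idx])
    scores.append(net.score(X[test_idx], y[test_idx]))
print(f"mean R^2 across folds: {np.mean(scores):.3f}")
```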
Wooden Stake Recognition. Recognizing the wooden stake inserted beside a seedling is more straightforward. The hybridization of the Otsu method and median filter {23/pacifico2018hybrid} can be used for stake recognition. Fig. 20 shows the schematic steps of the stake segmentation method. The wooden stake is almost vertically straight. Thus, hidden and covered parts of the stake can be found using simple partial spline matching.
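This hybrid step maps naturally onto standard OpenCV calls; the sketch below is one plausible reading, not the authors' exact implementation, and the kernel size is an assumption.

```python
import cv2

def segment_stake(bgr, ksize=5):
    """Median filter followed by Otsu's global threshold, per the hybrid method."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    smoothed = cv2.medianBlur(gray, ksize)   # suppress speckle noise first
    _, mask = cv2.threshold(smoothed, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask
```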
After recognizing the stem and stake, we find the region of interest and use the point density variation, kernel density estimator, the principal orientation of the histogram gradient, and normalized cross-correlation for matching to recognize a suitable clipping point.
Region of Interest. To avoid unwanted points and accelerate the process of finding the most suitable clipping point, the multi-stage point density method uses the boundary and skeleton of the seedling to find the region of interest and limit the search area. The borders of the region of interest are computed using Equations (3), (4), and (5) below,
[Equations (3)-(5) are rendered as images in the original and are not reproduced here.]

where \bar{S}_i and \bar{S}_j are the mean values of the skeleton in the x and y directions for all non-null pixels, P_r, P_l, and P_t are the right, left, and top values of the seedling boundaries for non-null values, S(i) and S(j) are the values of the skeleton in pixel (i, j), and \phi_r, \phi_l, and \phi_t are the number of null values for the right, left, and top of the boundary of the plant, respectively.
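Since Equations (3)-(5) themselves are not reproduced, the sketch below only illustrates computing their named inputs from binary skeleton and boundary masks; the function name, the axis convention, and the interpretation of the null counts \phi are assumptions.

```python
import numpy as np

def roi_inputs(skeleton, boundary):
    """Quantities named in Equations (3)-(5). The combining equations are
    rendered as images in the original, so only their inputs are shown."""
    sy, sx = np.nonzero(skeleton)
    s_bar_i, s_bar_j = sx.mean(), sy.mean()              # skeleton means in x and y
    by, bx = np.nonzero(boundary)
    p_r, p_l, p_t = bx.max(), bx.min(), by.min()         # right/left/top extremes
    h, w = boundary.shape
    phi_r = np.count_nonzero(boundary[:, w // 2:] == 0)  # null counts per region
    phi_l = np.count_nonzero(boundary[:, :w // 2] == 0)
    phi_t = np.count_nonzero(boundary[:h // 2, :] == 0)
    return s_bar_i, s_bar_j, p_r, p_l, p_t, phi_r, phi_l, phi_t
```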
Point Density Variation. The point density variation shows the disparity of intensity of colors on a map {24/lawin2018density}. A Gaussian mixture model represents a distribution for each color channel i as,
P_i(x) = \sum_{k=1}^{K} \pi_{i_k} \, \mathcal{N}(x \mid \mu_{i_k}, \Sigma_{i_k}) \qquad (6)

where \pi_{i_k} are the mixing coefficients that meet the following condition,

\sum_{k=1}^{K} \pi_{i_k} = 1, \quad 0 \le \pi_{i_k} \le 1 \qquad (7)

The density P_i(x) is the Gaussian distribution of intensities in each color channel, and \mathcal{N}(x \mid \mu_{i_k}, \Sigma_{i_k}) is the Gaussian density with the mean value \mu_{i_k} and the variance \Sigma_{i_k}. Considering the Gaussian mixture model as a density estimator with diagonal covariance matrices of the form \Sigma_i = \sigma_i^2 I {25/ruckebusch2016resolving} yields,

P_i(x) = \frac{1}{\sqrt{2\pi\sigma_i^2}} \exp\!\left(-\frac{(x - \mu_i)^2}{2\sigma_i^2}\right) \qquad (8)

where \mu_i and \sigma_i^2 are the mean value and variance.
We applied the Gaussian distribution to each color channel of the segmented image and the mean of all three Gaussian distributions represents the point density variation of the seedling.
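As an illustration, the per-channel mixture fit and averaging can be reproduced with scikit-learn as follows; the component count and the all-zero background-masking convention are assumptions, not details given in the text.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def point_density_variation(segmented, n_components=3):
    """Fit a diagonal-covariance Gaussian mixture per color channel of the
    segmented plant pixels and average the three per-channel densities."""
    pixels = segmented.reshape(-1, 3).astype(np.float64)
    pixels = pixels[pixels.any(axis=1)]     # drop all-zero (background) pixels
    densities = []
    for ch in range(3):
        x = pixels[:, ch:ch + 1]
        gmm = GaussianMixture(n_components=n_components,
                              covariance_type="diag", random_state=0).fit(x)
        densities.append(np.exp(gmm.score_samples(x)))   # P_i(x) for each pixel
    return np.mean(densities, axis=0)       # mean of the three channel densities
```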
Kernel Density Estimator. In most computer vision and pattern recognition applications, the feature space is complex and noisy and can rarely be described by common parametric models, so non-parametric density estimation techniques have been widely used to analyze arbitrarily structured feature spaces {26/yang2003improved}. The kernel density estimator, a non-parametric density estimation technique, calculates the density of features in a neighborhood around those features; the density function is estimated by a sum of kernel functions (typically Gaussians) centered at the data points {27/elgammal2002background}. A bandwidth associated with the kernel function is chosen to control the smoothness of the estimated densities; more data points allow a narrower bandwidth and a better density estimate, and the kernel density estimator spreads the known quantity of the population for each point out from the point location of random non-parametric variables {28/matioli2018new}. As a result, point density variation estimates the density of stem intensity, and kernel density estimation is a fundamental data smoothing technique, as shown in Fig. 21, that makes inferences about the population from the finite data sample {29/scaldelai2022multiclusterkde}. Assuming X_1, X_2, ..., X_n are independent and identically distributed points that have a univariate distribution with an unknown density f at any given point x, the kernel density estimator is defined as,

\hat{f}_h(x) = \frac{1}{n}\sum_{i=1}^{n} K_h(x - X_i) = \frac{1}{nh}\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h}\right) \qquad (9)

where K is the kernel, h > 0 is a smoothing parameter called the bandwidth, and K_h(x) = \frac{1}{h}K\!\left(\frac{x}{h}\right) is the scaled kernel. As a rule of thumb {30/chen2016comprehensive}, if the Gaussian basis function is used to approximate univariate data, the optimal choice for the bandwidth is,

h = \left(\frac{4\hat{\sigma}^5}{3n}\right)^{1/5} \approx 1.06\,\hat{\sigma}\,n^{-1/5} \qquad (10)

where \hat{\sigma} is the sample standard deviation.
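A brief SciPy sketch follows; its "silverman" bandwidth rule implements the same rule of thumb as Equation (10), and the bimodal sample data is purely illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
sample = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(4.0, 0.5, 100)])

kde = gaussian_kde(sample, bw_method="silverman")   # rule-of-thumb bandwidth
grid = np.linspace(sample.min() - 1, sample.max() + 1, 256)
density = kde(grid)                                 # smoothed estimate of f
print(f"densest point near x = {grid[np.argmax(density)]:.2f}")
```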
Principal Orientation of the Histogram Gradient. To compare candidate features for the most suitable clipping point, we matched the Principal Orientation of the Histogram Gradient resulting from the point density variation and the kernel density estimator. The Principal Orientation of the Histogram Gradient represents a signature of the features of spatial regions {31/lauria2018nonparametric}, {32/wiangsamut2022fast}. We divided the region of interest into sub-images (voxels) with a size of about 1.5 times the stem's average thickness. The boundary of the stem was used to calculate the stem's average thickness {33/wang2020fruit}. This value can also be assigned by the user. For computing the histogram gradient of each voxel, we took the first quarter of the voxel and computed the gradient orientation in eight directions, and considered all different directions and the number of pixels within each direction. These steps were repeated for other quarters. The resulting orientation histograms were concatenated to create a long histogram gradient. The largest gradient direction was then selected as the principal of the long histogram gradient. Fig. 22 shows the main steps of computing the principal orientation of the histogram gradient. We used Equations (11) and (12) to calculate the size and angle of the principal of the long histogram gradient,
\|G\| = \sqrt{G_x^2 + G_y^2} \qquad (11)

\theta = \tan^{-1}\!\left(\frac{G_y}{G_x}\right) \qquad (12)

where G_x and G_y are the horizontal and vertical components of the principal gradient.
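One plausible realization of the quarter-wise histogram construction described above is sketched below using OpenCV gradients; the 8-bin layout follows the text, while weighting bins by gradient magnitude and the function name are assumptions.

```python
import cv2
import numpy as np

def principal_orientation(voxel):
    """Concatenate 8-bin gradient-orientation histograms from the four quarters
    of a voxel, then return the size and angle of the dominant bin."""
    gx = cv2.Sobel(voxel, cv2.CV_32F, 1, 0)
    gy = cv2.Sobel(voxel, cv2.CV_32F, 0, 1)
    mag, ang = cv2.cartToPolar(gx, gy)               # magnitude, angle in radians
    h, w = voxel.shape
    hist = []
    for qy in (slice(0, h // 2), slice(h // 2, h)):
        for qx in (slice(0, w // 2), slice(w // 2, w)):
            bins = (ang[qy, qx] / (2 * np.pi) * 8).astype(int) % 8
            hist.append(np.bincount(bins.ravel(),
                                    weights=mag[qy, qx].ravel(), minlength=8))
    hist = np.concatenate(hist)                      # the "long histogram gradient"
    k = int(np.argmax(hist))
    return hist[k], (k % 8) * 2 * np.pi / 8          # size and angle of the principal
```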
We normalized the principal orientation of all histogram gradients' voxels. Normalizing the histogram diminishes the effect of scaling and ignores the magnitude of the gradient. The ground voxels were selected based on ten valid clipping points on seedlings. We rotated the voxels whose principal orientation angles were different from the angles of the principal orientation of the ground voxels. The rotation ignores the effect of different orientations of the candidate voxels when we want to compare them with ground voxels. We selected the candidate voxels of the image if they had the same signature as the ground voxels. To select the best candidate, the normalized correlation metric was used to match the histograms, i.e.,
d(H_1, H_2) = \frac{\sum_i \big(H_1(i) - \bar{H}_1\big)\big(H_2(i) - \bar{H}_2\big)}{\sqrt{\sum_i \big(H_1(i) - \bar{H}_1\big)^2 \sum_i \big(H_2(i) - \bar{H}_2\big)^2}} \qquad (13)

where \bar{H}_k = \frac{1}{N}\sum_j H_k(j), and H_1 and H_2 are the histograms of the candidate voxel and ground voxels with the same normalized principals of the long histogram gradient. The larger the distance metric, the better the match. A perfect match is when d(H_1, H_2) = 1.
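Equation (13) is the standard normalized correlation between histograms (the same measure as OpenCV's HISTCMP_CORREL); a direct NumPy transcription:

```python
import numpy as np

def histogram_correlation(h1, h2):
    """Normalized cross-correlation d(H1, H2); 1.0 indicates a perfect match."""
    a, b = h1 - h1.mean(), h2 - h2.mean()
    return float(np.sum(a * b) / np.sqrt(np.sum(a * a) * np.sum(b * b)))

h = np.array([3.0, 5.0, 2.0, 9.0, 1.0])
print(histogram_correlation(h, h))   # 1.0 for identical histograms
```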
Overall Approach. The multi-stage point density algorithm follows multiple steps to find the clipping point. Fig. 23A summarizes the proposed algorithm step by step for bell pepper seedlings. In the first row, after camera calibration and pre-processing, the image is transferred to the LAB color space, and the adaptive color image segmentation and the hybridization of the Otsu method and median filter are applied to recognize the stem and stake. The boundary and skeleton of the plant are obtained next using morphological image processing operations. In row two, the region of interest is defined using the boundary and skeleton. A combination of recursive dilation and erosion and morphological techniques such as hit-or-miss, convex hull, and thinning are used to eliminate the leaves from the plant to find the stem. The third row contains the results of applying the point density variation (Fig. 23B shows a magnification of the point density variation plot) and the kernel density estimator for finding the most suitable clipping point on the stem. In row four, the stake is checked to determine whether there is a corresponding point on the stake. If accessible, the multi-stage point density method suggests the point and calculates the distance and depth using the images from the stereo camera {34/dandil2019computer}. Finally, the multi-stage point density algorithm calculates the most suitable orientation and position of the clipping device in the real coordinates of the robotic arm.
RESULTS AND DISCUSSION. We trained the ANN using the K-Fold cross-validation method with 180 labeled images from tomato, cucumber, and pepper seedlings. After training, the ANN was able to find the cut-off values for multiple thresholds for each pixel, based on the variance and entropy around the pixel, and to recognize the stem of the seedlings in new real images.
To evaluate the accuracy of the adaptive segmentation algorithm based on feature descriptors (entropy and variance), we compared the results with four other automatic adaptive segmentation methods: the Otsu method, the triangle algorithm, adaptive Gaussian thresholding, and Li's minimum cross-entropy. Fig. 24 shows the results obtained using these algorithms.
To evaluate the accuracy of the segmentation algorithm, we calculated the mean squared error (MSE), i.e., the average of the squared per-pixel errors of each channel, for 10 test samples of each seedling. To calculate the MSE for two pictures (i.e., a labeled image and a segmented image), we calculated the square of the difference between every pixel in the labeled image and the corresponding pixel in the segmented image, added the results, and then divided the sum by the number of pixels. Table 1 summarizes the average MSE for the three types of seedlings. A smaller MSE value indicates greater similarity between labeled and segmented images, and hence better results for stem and stake recognition. Fig. 25 shows the segmented stems and stakes of three different plants using adaptive color image segmentation with feature descriptors and the hybridization of the Otsu method and median filter.
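For reference, the MSE computation described above amounts to the following; array shapes and dtypes are illustrative.

```python
import numpy as np

def mse(labeled, segmented):
    """Mean squared error between a labeled image and a segmented image:
    per-pixel squared differences summed and divided by the pixel count."""
    diff = labeled.astype(float) - segmented.astype(float)
    return float((diff ** 2).sum() / diff.size)
```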
TABLE 1: Average mean square error (MSE) for stem recognition for pepper, cucumber, and tomato seedlings.
After adaptive segmentation, we applied the multi-stage point density method to three types of seedlings to find the correct position of the clipping points and evaluate the quality of the results. The first category contained 120 images of bell pepper seedlings, the second category had 80 images of tomatoes, and the third category contained 80 images of cucumbers. To check the efficacy of the algorithm, expert farmers evaluated the results. Table 2 shows the success rate of finding the suitable clipping point for each type of seedling. Cucumber leaves are large, and access to the stem at the top of the plant is difficult; thus, the success rate for cucumbers is lower than for the other seedlings. Fig. 26 shows the recognized clipping points on the seedlings using the multi-stage point density method.
TABLE 2: Success rate of finding the suitable clipping point for each seedling.
In some cases, the stem or stake or both could not be accessed, for example when they were behind leaves, or when the stem and stake were far from each other. In such cases, the algorithm could not suggest a suitable clipping point. As shown in Fig. 27, the algorithm could suggest a suitable clipping point in samples 1 and 2. Sample 3 presents a case where a suitable clipping point could be identified, but the stake was not accessible. If the distance between the stake and stem was more than an acceptable amount, the multi-stage point density algorithm identified the case as non-accessible to the clipping device. Finally, sample 4 shows a case where neither the stem nor the stake was accessible. A possible solution for such cases is to take other images from different angles or even from the opposite side of the seedling.
We installed the clipping device and stereo camera on a KUKA robotic arm. The stereo camera took images and when the multi-stage point density method found the most suitable clipping point, the robotic arm moved the clipping device near the identified clipping point, and the clipping device clipped the seedling. Fig. 28 shows examples of seedlings with clips on them.
CONCLUSION. Experimental Example 1 proposes a new approach for finding the most suitable clipping point for a new robotic clipping system under development. The proposed approach is conceptually different from other feature detection methods in that it combines analytical image processing methods and data-driven learning algorithms. This allows us to solve the challenging problem of clipping point detection. We evaluated the algorithm using hundreds of real seedling images of peppers, tomatoes, and cucumbers from Roelands Plant Farms Inc. It was shown that the proposed adaptive segmentation was able to more reliably segment both stem and stake in each image. The success of our adaptive segmentation approach was in part due to the use of the variance and entropy of voxels as two effective features for tuning the local cut-off values. The final results of the algorithm, i.e., the identified clipping points, were verified by expert farmers to validate the efficacy of the algorithm. As a whole, the obtained results indicated satisfactory performance in finding the most suitable clipping point.
Further improvement of our algorithm is contemplated. For example, the results from adaptive segmentation showed that stem recognition could be difficult when the leaves cover a large part of the stem, or when the pattern and color of the stem are similar to those of the leaves behind it. One solution to this problem is to use several labeled images to train an ANN, or to estimate the stem position using rational rules such as the continuity of the stem. When the stem is recognizable in both the right and left images, our approach finds the clipping point perfectly. In other cases, where the stem cannot be recognized in both images, one can obtain other images from different angles around the seedlings to address the issue.

REFERENCES FOR EXPERIMENTAL EXAMPLE 1.
[1] E. Mavridou, E. Vrochidou, G. A. Papakostas, T. Pachidis, V. G. Kaburlasos, Machine vision systems in precision agriculture for crop farming, Journal of Imaging 5 (12) (2019) 89.
[2] R. Berenstein, Y. Edan, Automatic adjustable spraying device for site specific agricultural application, IEEE Transactions on Automation Science and Engineering 15 (2) (2017) 641-650.
[3] X. Liu, W. Min, S. Mei, L. Wang, S. Jiang, Plant disease recognition: A large-scale benchmark dataset and a visual region and loss reweighting approach, IEEE Transactions on Image Processing 30 (2021) 2003-2015.
[4] H. Kang, H. Zhou, X. Wang, C. Chen, Real-time fruit recognition and grasping estimation for robotic apple harvesting, Sensors 20 (19) (2020) 5670.
[5] D. Su, H. Kong, Y. Qiao, S. Sukkarieh, Data augmentation for deep learning based semantic segmentation and crop-weed classification in agricultural robotics, Computers and Electronics in Agriculture 190 (2021) 106418.
[6] S. Kolmanič, D. Strnad, Š. Kohek, B. Benes, P. Hirst, B. Žalik, An algorithm for automatic dormant tree pruning, Applied Soft Computing 99 (2021) 106931.
[7] P. Fan, G. Lang, P. Guo, Z. Liu, F. Yang, B. Yan, X. Lei, Multi-feature patch-based segmentation technique in the gray-centered rgb color space for improved apple target recognition, Agriculture 11 (3) (2021) 273.
[8] M. Zhang, S. Li, F. Yu, X. Tian, Image fusion employing adaptive spectral-spatial gradient sparse regularization in uav remote sensing, Signal Processing 170 (2020) 107434.
[9] M. Shantkumari, S. Uma, Grape leaf segmentation for disease identification through adaptive snake algorithm model, Multimedia Tools and Applications 80 (6) (2021) 8861-8879.
[10] M. A. Shirzi, M. Hairi-Yazdi, Active tracking using intelligent fuzzy controller and kernel-based algorithm, in: 2011 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2011), IEEE, 2011, pp. 1157-1163.
[11] M. Kerkech, A. Hafiane, R. Canals, Vine disease detection in uav multispectral images using optimized image registration and deep learning segmentation approach, Computers and Electronics in Agriculture 174 (2020) 105446.
[12] M. Chen, Y. Tang, X. Zou, K. Huang, Z. Huang, H. Zhou, C. Wang, G. Lian, Three-dimensional perception of orchard banana central stock enhanced by adaptive multi-vision technology, Computers and Electronics in Agriculture 174 (2020) 105508.
[13] D. Li, Z. Song, C. Quan, X. Xu, C. Liu, Recent advances in image fusion technology in agriculture, Computers and Electronics in Agriculture 191 (2021) 106491.
[14] Y. Bai, Y. Guo, Q. Zhang, B. Cao, B. Zhang, Multi-network fusion algorithm with transfer learning for green cucumber segmentation and recognition under complex natural environment, Computers and Electronics in Agriculture 194 (2022) 106789.
[15] R. Jayakumari, R. R. Nidamanuri, A. M. Ramiya, Object-level classification of vegetable crops in 3d lidar point cloud using deep learning convolutional neural networks, Precision Agriculture 22 (5) (2021) 1617-1633.
[16] Z. Kolar, H. Chen, X. Luo, Transfer learning and deep convolutional neural networks for safety guardrail detection in 2d images, Automation in Construction 89 (2018) 58-70.
[17] G. Deng, An entropy interpretation of the logarithmic image processing model with application to contrast enhancement, IEEE Transactions on Image Processing 18 (5) (2009) 1135-1140.
[18] X. Qi, G. Chen, Y. Li, X. Cheng, C. Li, Applying neural-network based machine learning to additive manufacturing: current applications, challenges, and future perspectives, Engineering 5 (4) (2019) 721-729.
[19] M. Aasim, R. Katirci, O. Akgur, B. Yildirim, Z. Mustafa, M. A. Nadeem, F. S. Baloch, T. Karakoy, G. Yilmaz, Machine learning (ML) algorithms and artificial neural network for optimizing in vitro germination and growth indices of industrial hemp (Cannabis sativa L.), Industrial Crops and Products 181 (2022) 114801.
[20] Z. Zhang, A flexible new technique for camera calibration, IEEE Transactions on Pattern Analysis and Machine Intelligence 22 (11) (2000) 1330-1334.
[21] S. Thapar, S. Garg, Study and implementation of various morphology based image contrast enhancement techniques, Int. J. Comput. Bus. Res 128 (2012) 2229-6166.
[22] A. Ruchay, V. Kober, Impulsive noise removal from color images with morphological filtering, in: International Conference on Analysis of Images, Social Networks and Texts, Springer, 2017, pp. 280-291.
[23] L. D. Pacifico, T. B. Ludermir, L. F. Britto, A hybrid improved group search optimization and otsu method for color image segmentation, in: 2018 7th Brazilian Conference on Intelligent Systems (BRACIS), IEEE, 2018, pp. 296-301.
[24] F. J. Lawin, M. Danelljan, F. S. Khan, P.-E. Forssén, M. Felsberg, Density adaptive point set registration, in: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018, pp. 3829-3837.
[25] C. Ruckebusch, Resolving spectral mixtures: with applications from ultrafast time-resolved spectroscopy to super-resolution imaging, Elsevier, 2016.
[26] C. Yang, R. Duraiswami, N. A. Gumerov, L. Davis, Improved fast gauss transform and efficient kernel density estimation, in: Computer Vision, IEEE International Conference on, Vol. 2, IEEE Computer Society, 2003, pp. 464-464.
[27] A. Elgammal, R. Duraiswami, D. Harwood, L. S. Davis, Background and foreground modeling using nonparametric kernel density estimation for visual surveillance, Proceedings of the IEEE 90 (7) (2002) 1151-1163.
[28] L. Matioli, S. Santos, M. Kleina, E. Leite, A new algorithm for clustering based on kernel density estimation, Journal of Applied Statistics 45 (2) (2018) 347-366.
[29] D. Scaldelai, L. Matioli, S. Santos, M. Kleina, Multiclusterkde: a new algorithm for clustering based on multivariate kernel density estimation, Journal of Applied Statistics 49 (1) (2022) 98-121.
[30] Y.-C. Chen, C. R. Genovese, L. Wasserman, A comprehensive approach to mode clustering, Electronic Journal of Statistics 10 (1) (2016) 210-241.
[31] S. Lauria, N. Dorudian, S. Swift, Nonparametric background modelling and segmentation to detect micro air vehicles (mav) using rgb-d sensor, SAGE Publications (2018).
[32] S. Wiangsamut, N. Eua-Anant, W. Phomphatcharaphong, Fast boundary extraction of color image using negative divergence of a normal compressive vector field, Engineering and Applied Science Research 49 (2) (2022) 168-180.
[33] Y. Wang, Y. Chen, Fruit morphological measurement based on three dimensional reconstruction, Agronomy 10 (4) (2020) 455.
[34] E. Dandil, K. K. Cevik, Computer vision based distance measurement system using stereo camera view, in: 2019 3rd International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT), IEEE, 2019, pp. 1-4.
EXPERIMENTAL EXEMPLIFICATION. EXPERIMENTAL EXAMPLE 2.
Experiments describe a robotic device featuring a machine vision unit with a stereo camera and a mechatronic unit with two claw-shaped arms that is referred to as an automatic stem-stake coupling device (ACD). In addition, the stem-stake coupling device can serve as a hand-held device for semi-automatic coupling by growers; in this example, the hand-held version of the stem-stake coupling device (HCD) has neither the claw-shaped arms nor the stereo camera. The ACD and HCD utilize interconnected mechanisms to create clips of various sizes and shapes from metallic wire. These mechanisms include the Pushing Mechanism, Curving Mechanism, Cutter Mechanism, and, for the ACD specifically, the Collector Mechanism. Both devices operate on the principle of feeding a thin wire from the Feeding Wire Spool. The wire is pulled by the Main Feeder and is pressed against the Feeder Supporter. As the wire is pulled, it moves through the Wire Guider while a Bender shapes it into a ring-shaped clip. At the end of this cycle, the Cutter cuts the wire when a full clip is formed. A first actuator, such as an electric motor (Servo Motor 1), turns the Main Feeder and drives the Cutter as well. The clipping mechanism incorporates an Optical Sensor that sends a pulse to indicate the completion of each cycle. When the clipping device is positioned near the stem of the plant, the wire wraps around the stem and the wooden stake. The size and shape of the clips can be adjusted based on the seedling or plant using a Tuning Screw to adjust the Bender. The ACD is equipped with a stereo camera and claw-shaped arms. The arms bring the stem and wooden stake into proximity before clipping. A second actuator, such as an electric motor (Servo Motor 2), rotates the claw-shaped arms, bringing the head of the clipping device closer to the stem and wooden stake and closing the arms. The ACD is integrated into a robotic system. A vision system utilizes stereo images to provide real-time information about the optimal orientation and position of the stem-stake coupling point, as well as the 3D spatial coordinates of the ACD, which are then transmitted to the robotic arm. In both the ACD and HCD, the impedance control method is employed to regulate the speed and torque of the servo motors based on the desired shape and size of the clips. Various materials can be used to produce the clips, including metals like steel and copper wire, as well as plastic materials such as polyethylene or polyamide wire. Among these choices, copper wire is often selected based on growers' preferences. For plastic materials, a heater is required to preheat the material before feeding it through the Wire Guider.
Pushing Mechanism. The pushing mechanism's role is to exert force and propel the wire forward at a specified velocity. The Main Feeder is connected to the Main Servo Motor (Servo Motor 1). The Main Feeder has a Central Groove around its perimeter, which ensures the wire stays centered as it moves. Additionally, there are Transverse Grooves perpendicular to the Central Groove on the Main Feeder's perimeter. These Transverse Grooves create small indentations on the wire, propelling it forward as the Main Feeder rotates. The length of the clip depends on the diameter of the Main Feeder, the number of Transverse Grooves, and the arc length of the Resting Gap. The relationship between these parameters and the length of the clip may be expressed as the empirical model of Equation (1), discussed above with reference to Fig. 7.

Curving Mechanism. The Curving Mechanism bends the wire into the desired shape. As the wire moves forward, it passes through a Wire Guider and then encounters a Bender. There is a hole/bore/channel inside the Wire Guider along the direction the wire is being pushed. This hole ensures that the wire follows a straight path and prevents any bending before the wire reaches the Bender. The Bender applies a normal force to the wire to bend it. The normal force applied on the wire is proportional to the pulling force of the Main Feeder. This force determines the shape and diameter (i.e., the curvature) of the clip, as well as the number of wire overlaps. The surface and the profile angle of the Bender affect the direction of the normal force applied to the wire. The Tuning Screw installed on the Bender allows tuning of the Bender's position relative to the Wire Guider, adjusting the direction of the applied force. The surface profile of the Bender produces different clip shapes. Fig. 9 shows two examples of different mechanisms with different profiles that produce clips with different diameters. A first illustrative mechanism (Fig. 9B) involves the Rotational Mechanism with a flat Bender surface. Adjusting the Bender's angle changes the orientation of its flat surface relative to the wire, thereby altering the force exerted on the wire and resulting in a different clip radius. For instance, clockwise rotation of the Bender increases the force on the wire, resulting in smaller clip radii, and vice versa. Fig. 9A shows the Positional Mechanism, in which the angle of the Bender does not change; instead, the position of the Bender changes relative to the Wire Guider. In this mechanism, the surface of the Bender is curved. Thus, a vertical movement of the Bender alters the force applied to the wire, resulting in a change in the clip's radius. The relationship between the clip's radius and the Bender's position is illustrated in Fig. 8. Additionally, Fig. 10 demonstrates how adjusting the Cam profile produces various clip shapes. In this mechanism, the Cam has different radii at different points, so when the Cam pushes the Lever Cam, the rotational angle of the Lever Cam varies. The rotation of the Lever Cam changes the position of the Bender, thereby altering the normal force applied to the wire. As a result, the clip shape will be spiral. The benefit of the spiral clip lies in its ability to securely grasp the stem and stake even when they are not closely positioned, particularly when the clip's initial radius is relatively large. In the final step of making the clip, the smaller radius of the clip tightens the stem and stake together.
Cutter Mechanism. Once the clip is created, the Cutter Mechanism detaches it from the wire. Both the ACD and HCD utilize a single servo motor to push, curve, and cut the wire through an integrated mechanism. The servo motor rotates the Main Feeder and a Cam at the end of each cycle. The Main Feeder rotates about 5.8 rad while pushing the wire forward and forming a clip. During the final 0.5 rad, the Main Feeder incorporates a Resting Gap to halt the wire feed before cutting. The continued rotation of the Main Feeder then engages the Cutter via a Camshaft and Cam Lever, severing the wire.
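The patent's empirical model (Equation (1)) is not reproduced in this section, but a rough geometric sketch of how the stated parameters interact can be written as follows; this is an illustrative idealization, not the patent's formula, and the default Resting Gap of 0.5 rad is taken from the Cutter Mechanism description above.

```python
import math

def approx_clip_length(feeder_diameter_mm, resting_gap_rad=0.5):
    """Idealized wire advance per cycle: the feed arc (one revolution
    minus the Resting Gap, ~5.8 rad) times the Main Feeder radius.
    The Transverse Grooves set how finely the wire engages but do not
    change this idealized arc length."""
    feed_arc = 2 * math.pi - resting_gap_rad
    return feed_arc * feeder_diameter_mm / 2.0

# e.g., a 20 mm Main Feeder would advance roughly 58 mm of wire per cycle
print(approx_clip_length(20.0))
```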
Collector Mechanism. For automatic stem-stake coupling, the ACD is equipped with a unique collector mechanism that uses claw-shaped arms. Figs. 5A-5E illustrate the collector mechanism and the multiple steps of closing the claw-shaped arms and repositioning the ACD's head. A second servo motor, Servo Motor 2, causes the claw-shaped arms to close. The Arm Holder continues to move forward to fully close the claw-shaped arms, securely holding the stem and stake in the center of the arms with minimal damage owing to the slots and shape of the claw-shaped arms. By closing these arms, the stem and stake are held together in the center of the arms. Additionally, the collector moves the ACD's head closer to the stem and stake without further closing the claw-shaped arms before creating the clip, as shown in Fig. 5E. This is an advantageous step that prevents leaves from colliding with the ACD before the Collector Mechanism grabs them. Once the clipping is finished, Servo Motor 2 retracts the Arm Holder, causing the ACD's head to move backward. This action opens the claw-shaped arms, releasing the plant with minimal interference to the leaves.
MOTOR CONTROL. The power for pushing, curving, and cutting the wire is produced by the Main Servo Motor in both the HCD and ACD. The load on the servo motor may vary dynamically at each step, and the rotational speed of the servo motor also influences both the shape and the quality of the clip. Maintaining a consistent force at a specific angular velocity in the presence of dynamic external torques therefore benefits accurate bending of the wire. Hence, torque control is significant for achieving optimal performance and preventing stalling or overloading. Torque control allows the motor to adjust its output torque to compensate for changes in the load, ensuring repeatable motion control with consistent performance. The control objective of an impedance controller in our system is to impose a desired dynamic relationship between the servo motor's position and the force of interaction with the wire. Impedance is defined as the ratio of the force to the position. Unlike position control, impedance control enables the motor to behave as a mass-spring-damper system, commanding the desired position in response to the interaction force of the servo motor and external factors. The desired dynamic relationship between the servo motor's position and the interaction force can be expressed as,

m_d δθ̈ + b_d δθ̇ + k_d δθ = f_e (15)

where δθ = (θ − θ_r), θ and θ_r are the motor's angular position and reference angular position, respectively, and m_d, b_d, and k_d represent the desired inertia, damping, and stiffness of the system, respectively. In response to the external force f_e, the impedance controller generates a modified position δθ as follows,
δθ(s) = f_e(s) / (m_d s^2 + b_d s + k_d) (16)
where s represents the Laplace transform variable and all other parameters are as defined previously. The impedance controller remains stable as long as m_d, b_d, and k_d are positive values. The desired inertia m_d is dominated by the intrinsic inertia of the Main Feeder due to its mass, simplifying the choice of the desired impedance to the selection of b_d and k_d. A higher damping coefficient ensures greater stability of the controller, while increased stiffness implies greater resistance of the Main Feeder against external torques. External torques result from pushing the wire forward, curling the wire, cutting it, or other unwanted torques on the motor due to unwanted twists in the fed wire. Fig. 29 depicts the block diagram of the proposed impedance controller. The outer loop naturally closes when the servo motor encounters external torques. Using the estimated torque feedback, the impedance function generates δθ and commands the desired angle θ_d. The inner loop consists of a PID controller that tracks the desired trajectory to achieve suitable movement of the Main Feeder and, consequently, of other parts such as the Bender and Cutter.
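The two-loop structure of Fig. 29 can be sketched as follows. This is a minimal forward-Euler illustration, assuming an estimated external torque f_e is available each cycle; the gains and time step are placeholders, not the authors' tuned values.

```python
def impedance_offset(f_e, m_d, b_d, k_d, state, dt):
    """Outer loop: integrate m_d*x'' + b_d*x' + k_d*x = f_e one step
    to obtain the position modification delta_theta (Equation (15))."""
    x, v = state
    a = (f_e - b_d * v - k_d * x) / m_d
    v += a * dt
    x += v * dt
    return x, (x, v)

def pid_step(error, integ, prev_err, kp, ki, kd, dt):
    """Inner loop: PID tracking of the modified reference angle."""
    integ += error * dt
    deriv = (error - prev_err) / dt
    u = kp * error + ki * integ + kd * deriv
    return u, integ, error

# One control cycle (illustrative gains and dt):
# delta, state = impedance_offset(f_est, 0.01, 0.5, 5.0, state, 0.001)
# theta_d = theta_ref + delta
# u, integ, prev = pid_step(theta_d - theta_meas, integ, prev,
#                           kp=2.0, ki=0.1, kd=0.05, dt=0.001)
```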
RESULTS. Both the ACD and HCD are capable of producing clips in various shapes with the required diameters, including simple and spiral forms. As discussed with reference to Equation (1), Fig. 7 illustrates the correlation between clip length and the number of Transverse Grooves. Adjusting the Bender allows for changes in the diameter and shape of the clip.
The shape and quality of the clip are influenced by the torque generated by the Main Servo Motor throughout the various stages of wire curling and cutting. Fig. 30 depicts the trajectory and torque profile of the Main Servo Motor across both the HCD and ACD platforms during a complete cycle of clip production. In Fig. 13, the ACD and HCD produce a clip in approximately 0.95 seconds. The impedance controller regulates the servo motor to exert accurate force on the wire, maintaining a specific trajectory with the desired angular velocity during clip production. The pushing force applied to the wire begins 0.08 seconds into the process and lasts until 0.72 seconds. Wire cutting commences at 0.8 seconds and concludes at 0.92 seconds. This visualization highlights how the introduced impedance controller influences motor performance, affecting the precision and consistency of clip formation. We evaluated both the HCD and ACD to assess their effectiveness in performing the stem-stake coupling task. To assess the efficiency of the HCD, we considered both the acceptance rate of the clips made by the device on plants and seedlings and the speed of the clipping task. To do this, growers were initially asked to perform the clipping task using pre-prepared plastic clips, which is the standard method in all propagation facilities. Subsequently, the same farmers were instructed to use the HCD to couple the stem and stake for the same number of seedlings. We then compared the time taken and the quality of clipping between the two methods to evaluate the performance of the HCD. It was determined that producing each clip using the HCD requires approximately 0.95 seconds. Farmers spend extra time moving between seedlings, identifying the coupling point, and positioning the HCD near that point. Table 3 displays the average time taken and the acceptance rate of clipping with the HCD in comparison to the standard method.
TABLE 3: The acceptance rate of the HCD and the comparison of the time required to apply a clip to the seedling, with and without the HCD, are evaluated across four different seedlings.
Research findings from the data gathered for Table 3 indicated that farmers completed the stem-stake coupling task 87% faster with the HCD, reducing the time per seedling from 4.3 seconds to 2.3 seconds. This was attributed to the device requiring fewer movements and less energy for each seedling, enabling farmers to work for longer periods without the risk of body injuries related to bending or kneeling over time. When working with plastic clips, farmers typically need to use both hands simultaneously. However, the HCD reduced the need for a second hand. Additionally, the HCD can create clips in different shapes depending on the type and size of the seedlings. This eliminates the need to purchase pre-made plastic clips and results in additional cost savings.
We also assessed the performance of automatic clipping using the ACD on a robotic system designed to fully automate the stem-stake coupling task. Initially, we installed an ACD on a general-purpose robotic arm, the KUKA LWR IV. Subsequently, we developed and enhanced a complete robotic system, the robotic stem-stake coupling system (RSCS), specifically designed to handle the automatic stem-stake coupling task. The RSCS is a multi-task robotic solution developed to perform various tasks on seedlings and plants, including the clipping task. It was determined that producing each clip with the ACD takes roughly 0.95 seconds. Including the time for the claw-shaped arms to close and open, the average total cycle time is about 3.2 seconds. The robotic arm rotates around the seedling to locate an appropriate clipping point, recognizes this point, and positions the ACD correctly, a process that takes time. The average duration of these steps is detailed in Table 4 for four primary seedlings: Beit Alpha cucumber, chili pepper, bell pepper, and tomato. To assess the quality of the clips produced by the ACD, growers evaluated them. Table 4 also presents the acceptance rates of clip quality for each type of seedling. According to Table 4, two robotic arms equipped with ACDs can achieve the speed of one grower.
TABLE 4: The average time required to complete a clip and the acceptance rate of clip quality produced by the ACD across four different seedlings.
For optimal clipping, the selected point should be the highest point, above the uppermost node on the main stem. Pepper seedlings typically have a straight stem with sufficient spacing between nodes, resulting in a higher acceptance rate for bell peppers compared to other seedlings. Tomato seedlings, on the other hand, have a relatively short main stem between nodes. Additionally, cucumber leaves tend to be denser around the highest node, which can occasionally lead to issues during clipping as some parts of the leaves may get caught between the two claw-shaped arms, slightly reducing the clip quality.
Growers utilizing the HCD in propagation facilities and greenhouses can achieve approximately 86% greater efficiency compared to those employing plastic clips for the stem-stake coupling task. Given the extensive number of clips required annually for plants and seedlings, the HCD not only facilitates more efficient operations but also allows growers to allocate their time to other critical tasks. Additionally, the HCD offers cost-saving benefits, further enhancing its value in greenhouse and propagation facility management. The HCD simplified the clipping task, allowing farmers to work longer with a lower risk of back injuries caused by chronic poor posture. In addition, it was found that automatic clipping in large propagation facilities using the ACD accelerates the process for millions of seedlings and plants with fewer growers. The developed robotic stem-stake coupling system, featuring three gantries and 24 robotic arms equipped with ACDs, has the capacity to couple the stems of 12,000 bell pepper seedlings to stakes within an hour.
Both HCD and ACD utilize sustainable, eco-friendly materials, offering a viable alternative to traditional plastic clips. By using recyclable or biodegradable materials, these systems reduce environmental impact and support broader efforts to minimize waste and enhance sustainability in agricultural and horticultural practices.
EXPERIMENTAL EXEMPLIFICATION. EXPERIMENTAL EXAMPLE 3.
In propagation facilities, seedlings are cultivated using different methods. One style of cultivation is on the concrete floor, also known as the folding floor. Another style is cultivation on the tray system. In some cases, both styles are used, and seedlings are transferred from the concrete floor to the tray system semiautomatically using specialized equipment. To prepare the seedlings for clipping before transportation, other machinery in greenhouses is used to rearrange the seedlings by automatically altering the spacing between them for easier access. Once rearranged, the tray system carrying the seedlings is passed in front of human workers, who affix a plastic clip around the seedlings and the wooden stake.
To facilitate the adoption of a new robotic solution, the solution benefits from meshing with existing technology used in propagation facilities, allowing for smooth integration with other automated machinery and devices while reducing disruption and cost increases. Therefore, the most recommended robotic solutions are those that can be installed directly where growers perform clipping.
Multiple methods may be employed to access seedlings for clipping tasks using a robotic system. Options include employing a mobile robot, utilizing a gantry to maneuver a robotic arm around seedlings, or employing a fixed robotic arm with a carrier transporting seedlings within its workspace. Each strategy offers advantages in terms of efficiency, adaptability, and precision in seedling handling. One strategy is a concept involving compact robotic arms with a restricted operational range. A mobile gantry system moves the arm towards the seedlings, while the gantry glides along the rails of the tray system. This setup facilitates the development of small, simple, and lightweight robotic arms.
An alternative approach is to mount the robotic arm at a fixed point on the ground in front of the tray stream. When the tray system is passed in front of the robotic arm, the arm can access and clip the seedlings in a manner similar to human workers. This approach simplifies operations and reduces setup time for the vision system. However, it relies on the moving tray system and cannot be easily adapted for clipping seedlings on the concrete floor.
Another approach is a mobile robot carrying the gantry and a robotic arm. This configuration eliminates reliance on moving trays and can be used for tray systems of various sizes as well as the concrete floor to provide access to seedlings. The robotic arm can operate autonomously among seedlings to perform the clipping task. The disadvantage of this solution is the difficulty in managing multiple such robots. Additionally, the robotic arm’s stability may be compromised, potentially causing issues due to vibrations.
Another concept involves a four-wheel mobile system carrying a robotic arm near the seedlings. This method offers advantages in flexibility and adaptation to both cultivation styles.
Given the scale of operation, the choice of a robotic solution can have significant cost implications for propagation facilities.
To accelerate the stem-stake coupling process, multiple robotic arms can be strategically positioned around the seedlings as shown in Fig. 4. Each robotic arm is equipped with a camera to determine the optimal stem-stake coupling point and assess the seedling from various angles. If a suitable point is identified, the clipping action is performed. If not, the other robotic arm attempts the task. The gantry-style solution is more appropriate for accommodating a multi-arm solution.
The limited space around seedlings on the tray system may impede access to seedlings. To address this, the spacing between seedlings on trays can be adjusted to enhance accessibility for the robotic arms. This re-spacing is a common practice in manual stem-stake coupling as well.
Robotic System Framework. After evaluating various possibilities, it appears that adding a gantry system with multiple robotic arms onto the existing tray system is the optimal choice for propagation facilities. This approach eliminates the need to modify rails or trays or to rearrange seedlings, reducing additional workload. While the approach is shown for the tray system, it can be adapted to the concrete floor style. The number of gantries and robotic arms can be modified depending on the requirements of the facility and the volume of seedlings, owing to the object-oriented design of the robotic stem-stake coupling system. Fig. 31B illustrates a schematic representation of the robotic stem-stake coupling system, showcasing two gantries and nine robotic arms equipped with automatic clipping devices.
The robotic stem-stake coupling system (RSCS) has six major components: a Master Control Panel (MCP), Gantry, Robotic Arm, Automatic Clipping Device (ACD), Robotic Control Unit (RCU), and machine vision system. Fig. 32 shows the interconnection of these components.

Master Control Panel. The Master Control Panel (MCP) is a centralized interface enabling monitoring and control of key components within the robotic system. It acts as a central hub for coordinating operations, offering users access to essential controls, data, and functionalities. Access to the MCP is available through four distinct interfaces. The Graphical User Interface (GUI) is a visual interface that allows users to interact with the entire system through graphical icons and visual indicators. Fig. 33 shows the different modules of the GUI; brief descriptions of the numerically labelled GUI modules are as follows:
1] Incremental (Move Joints): The position and orientation of the ACD can be adjusted. Each joint can move independently at varying speeds or synchronously at set speeds.
2] Absolute (Move Joints): The ACD’s position and orientation can be adjusted relative to the robot’s absolute coordinates, with joints moving independently or synchronously.
3] Go Forward: The ACD moves forward or backward at the desired speed while maintaining orientation.
4] Motor Joystick: The user can adjust the ACD’s position and orientation using buttons.
5] Sending Code: Users can send commands to control motors and monitor sensors.
6] Learn the Point: The robotic arm learns the ACD's position and orientation, whether moving or stationary.
7] Show/Go to the Point: Users can view the learned ACD position and direct the robotic arm to that point.
8] Clipping Device: Indicating ACD status; users manage ACD at various control levels.
9] Motor/Sensors: Monitoring motors and sensors.
10] Machine Vision: Users can view stereo images and ML results, choose image processing methods, and direct the robotic arm to automate tasks with specified priorities, speeds, directions, and autonomy levels.
11] Main Buttons: Users select a port number, connect to the robotic arm, and access functions like emergency stop, reset, or exit the GUI.
12] Program/Code: Users can input or modify code for compilation, execute it line by line (forward or backward), pause/resume execution, or halt at each step.
13] Control Tray: Interface with third-party devices.
14] Compiler Panel: Displaying the code and allowing users to track its compilation progress.
15] TCP/IP Connection: Enabling remote robotic control via an internet connection, accessible from devices like Android.

The smartPAD serves as the interface for overseeing and managing the robotic system, equipped with touch-screen functionality and connectivity options such as wired or remote connections. This enables improved functionality and mobility of the robotic system. It assists the user in maneuvering around the trays and overseeing and managing the robotic system.
The TCP/IP offers the possibility of connecting the robotic system to the internet and controlling it remotely. This setup permits remote communication between the robotic system and the GUI, allowing farmers to send commands and receive feedback from anywhere with internet access. Farmers can observe the real-time status and performance of the robotic system via remote interfaces, which involves monitoring sensor data, tracking the robot’s location, and overseeing task progress. In addition to teleoperation, users can remotely control the robot’s movements and actions and can issue high-level commands to the robotic system, enabling it to autonomously execute predefined tasks. Remote access to robotic systems via TCP/IP facilitates maintenance tasks and diagnostics.
The developed Compiler allows farmers to write high-level commands abstractly and intuitively, without needing to modify the low-level details of the robotic system's hardware or communication protocols. Utilizing the compiler simplifies programming the robotic system and enhances accessibility for users of different technical skill levels. Users can remotely modify or enhance the functionality of the robotic system, while the system itself, equipped with high-level intelligence, can execute advanced commands in diverse conditions.
Gantry. In this example of a robotic system, the Gantry refers to the large and rigid framework that supports and guides the movement of the robotic arms, linear guides, rails, lights, cable carrier chains, and other tools. The Gantry provides a stable and secure structure for mounting robotic components. It ensures that the components are properly aligned and supported during operation, minimizing vibrations and inaccuracies in movement. It is structurally robust to withstand the stresses associated with the dynamic movements and the weight of the robotic components, as well as any payloads they are carrying. The Gantry framework ensures synchronized motion of the various components and includes feedback sensors to adjust the position of the robotic arms with precision. It is designed with flexibility, allowing for customization and adaptation to different environments in propagation facilities. To meet propagation facility and greenhouse needs, gantries can be installed on automated tray conveyance lines transporting seedling trays. The Gantry design avoids modifying conveyor lines during installation. Each Gantry can accommodate up to four robotic arms on each side. The Gantry's specialized design and object-oriented programming allow for adding up to eight robotic arms without requiring hardware or software modifications.

Robotic Arm. The robotic arm's responsibility is to place the ACD at the suitable stem-stake coupling point, ensuring it is correctly oriented while maneuvering along a specific route to avoid collisions with surrounding seedlings, all to complete the task efficiently. Taking into account the need for the robotic system to be economically viable and to meet the quality standards set by greenhouse owners, the design of the robotic arm incorporates considerations of speed, affordability, reliability, and ease of maintenance. Hence, the robotic arm should afford simplicity, eliminating the necessity for robust and costly hardware for calculations of inverse kinematics and path planning, as well as for the control system. It should be lightweight and equipped with the features of an intelligent multi-agent system to facilitate communication with adjacent robotic arms. Choosing an optimal configuration for the robotic arm from a range of possibilities is advantageous to achieving the desired performance of the robotic system. In evaluating various robotic arm configurations for the stem-stake coupling task, we took into account eleven key parameters as follows:
1] Payload Capacity: The maximum weight the robotic arm can handle without compromising performance or safety.
2] Reachable Workspace: The portion of the arm’s workspace where it can access objects or points from certain orientations.
3] Dexterous Workspace: The area within the robotic arm's workspace where it can reach points from various orientations, offering increased flexibility and versatility in manipulation tasks.
4] Reachability: The ability of the robotic arm to physically reach or access specific points or regions within its workspace.
5] Accessibility: The ease with which the robotic arm can reach and operate within different areas of its workspace, considering obstacles, joint configurations, and potential collisions.
6] Accuracy: How closely the robotic arm can position objects to a specified target, ensuring precision in task execution.
7] Precision: The repeatability of the robotic arm’s positioning, ensuring consistent performance in manipulating objects.
8] Speed: The rate at which the robotic arm can move impacts efficiency and cycle time for completing tasks.
9] Vibration: The level of oscillatory motion or shaking exhibited by the robotic arm during operation, which can affect accuracy, precision, and performance.
10] Control System: The hardware and software components responsible for programming, monitoring, and executing tasks with the robotic arm, ensuring efficient operation and coordination of movements.
11] Computation Cost: The computational resources required to control and operate the robotic arm, including processing power, memory, and energy consumption.
Five different configurations of robotic arms were selected and evaluated to determine the most suitable one for the robotic system, as shown in Fig. 34. Among the presented robotic arm configurations, the PPPRRR configuration emerges as the most suitable option for the robotic system. The initial three prismatic joints primarily handle the positioning of the ACD, while the subsequent three revolute joints govern its orientation.
After observing how growers manually couple seedling stems with wooden stakes, we determined that the longitudinal axis (roll) of their hands remains relatively steady. Depending on the type of seedling, there is a slight variation in the lateral axis (pitch) of their hand, while the vertical axis (yaw) usually changes based on the individual characteristics of each seedling and the arrangement of its leaves. To investigate further, we installed the ACD on a KUKA LWR IV general-purpose robotic arm and evaluated its performance for the stem-stake coupling task, concluding that the PPPRR (5-DOF) robotic arm is the ideal choice for our robotic system. Fig. 35 depicts the 5-degree-of-freedom (5-DOF) robotic arm, custom-designed and fabricated to fulfill the mentioned requirements of the robotic system. The robotic arm features three prismatic joints that adjust the position of the ACD relative to the main reference frame at the suitable clipping point. Additionally, there are two revolute joints responsible for orienting the ACD. The arm can handle payloads of up to 20 Newtons.
The Robot Control Unit (RCU) controls the robotic arm and manages the movement of each joint to achieve desired configurations or trajectories, allowing the arm to perform its designated tasks effectively. The RCU utilizes PID controllers to regulate the movement of individual joints, and it incorporates linear segments with parabolic blends (LSPB) for trajectory planning. The RCU ensures that the robotic arm maintains an accuracy of 0.1 mm and achieves a maximum speed of 400 mm/sec. Additionally, it guarantees an ACD orientation accuracy of 0.2 degrees and enables a maximum angular velocity of 120 degrees/sec. Users can utilize a keypad to send commands to control the robot independently.
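An LSPB joint trajectory of the kind the RCU uses can be sketched as follows; the velocity and acceleration limits are illustrative, and this minimal profile generator is not the RCU's implementation.

```python
def lspb(q0, qf, v_max, a_max, t):
    """Linear segment with parabolic blends: accelerate at a_max up to
    v_max, cruise, then decelerate symmetrically; returns position at t."""
    d = qf - q0
    sign = 1.0 if d >= 0 else -1.0
    d = abs(d)
    tb = v_max / a_max                    # blend (ramp) time
    if d < v_max * tb:                    # short move: triangular profile
        tb = (d / a_max) ** 0.5
        v_max = a_max * tb
    tf = d / v_max + tb                   # total motion time
    t = min(max(t, 0.0), tf)
    if t < tb:                            # acceleration blend
        s = 0.5 * a_max * t * t
    elif t < tf - tb:                     # constant-velocity segment
        s = 0.5 * a_max * tb * tb + v_max * (t - tb)
    else:                                 # deceleration blend
        rem = tf - t
        s = d - 0.5 * a_max * rem * rem
    return q0 + sign * s

# e.g., a 200 mm prismatic move at 400 mm/s and 800 mm/s^2:
# positions = [lspb(0.0, 200.0, 400.0, 800.0, k * 0.01) for k in range(80)]
```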
Fig. 36 presents the robotic arm’s logic flowchart, outlining the sequence of steps along with inputs, outputs, and loops. In propagation facilities, seedlings are arranged in a predetermined order on the tray, and the position of each seedling is accurately known on the tray with an accuracy of approximately 5 cm. The MCP evaluates the distance between the seedlings and the robotic arms to determine which seedlings should be clipped by each specific robotic arm. Based on the approximate position of the seedling on the tray and the robotic arm’s position in relation to the tray, the robot places the ACD near the desired seedling. Afterward, it follows the steps outlined in the flowchart to identify the appropriate point for stem-stake coupling. If obstructed by dense foliage, the robotic arm adjusts the ACD’s height or repositions it around the seedling to pinpoint a suitable location and finalize the coupling process. Alternatively, it may assign the coupling task to other robotic arms.
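The decision flow of Fig. 36 can be summarized by the following sketch; every method name here (move_near, find_coupling_point, and so on) is hypothetical shorthand for the behaviors described in the text, not an actual API of the system.

```python
def couple_seedling(arm, seedling, max_repositions=4):
    """Illustrative per-seedling control flow: approach the known tray
    position (~5 cm accuracy), search for a coupling point, reposition
    around foliage on failure, and hand off the task if none is found."""
    arm.move_near(seedling.approx_position)
    for _ in range(max_repositions):
        point = arm.vision.find_coupling_point()
        if point is not None:
            arm.orient_acd(point)        # place and orient the ACD
            arm.acd.clip()               # form and apply the wire clip
            return True
        arm.reposition_around(seedling)  # change height/angle past leaves
    return False                         # flag for another arm or a grower
```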
Machine Vision System. The ACD is equipped with a Megapixel/USB5000W02M stereo camera, which captures images of the seedlings. These images are then transmitted to the machine vision system for further processing. The machine vision system enables each robotic arm to analyze images and find the most suitable stem-stake coupling point using various algorithms and techniques. The initial stage of the vision algorithm involves seedling recognition. The machine vision system employs an adaptive feature-based plant recognition technique to accurately segment the seedling from other elements present in the image. The algorithm utilizes four distinct techniques to determine the clipping point, considering factors like seedling type, environmental conditions such as lighting, seedling age, and the presence of other seedlings in the image. The first technique, the Height Method, is straightforward: it finds the top of the seedling and identifies a point 3 to 5 centimeters below the top as the clipping point, depending on the type of the seedling. The second technique uses real-time point recognition using kernel density estimators and pyramid histograms of oriented gradients (KDE-PHOG) [M. Asadi Shirzi, M. R. Kermani, Real-time point recognition for seedlings using kernel density estimators and pyramid histogram of oriented gradients, in: Actuators, Vol. 13, MDPI, 2024, p. 81]. Real-time point localization using the feature-based soft margin SVM-PCA method (RTPL) [M. Asadi Shirzi, M. R. Kermani, Real-time point localization on plants using feature-based soft margin svm-pca method, in: IEEE Transactions on Instrumentation and Measurement, IEEE, under processing, 2024] represents another rapid and precise method for identifying the most suitable stem-stake coupling point. YOLO-v8 [T. Han, T. Cao, Y. Zheng, L. Chen, Y. Wang, B. Fu, Improving the detection and positioning of camouflaged objects in yolov8, Electronics 12 (20) (2023) 4213], a deep learning algorithm, presents another option for identifying the stem-stake coupling point. In addition to automatic stem-stake coupling point recognition methods, users have the option to manually define the point on the image using Manual Cursor Selection. This allows users to move the cursor across the image using the mouse and click on the desired point, which is then designated as the appropriate coupling point.
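The Height Method lends itself to a compact sketch; the 40 mm default offset sits in the stated 3-5 cm range, and the millimeters-per-pixel scale (which in practice would come from stereo depth) is an assumed input.

```python
import numpy as np

def height_method_point(stem_mask, mm_per_px, offset_mm=40.0):
    """Height Method sketch: locate the topmost stem pixel in the
    segmented mask and return an image point 3-5 cm below it."""
    rows, cols = np.nonzero(stem_mask)
    if rows.size == 0:
        return None                       # no stem found in this view
    top = np.argmin(rows)                 # image row 0 is the top
    target_row = int(rows[top] + offset_mm / mm_per_px)
    return target_row, int(cols[top])
```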
Automatic Clipping Device. A novel automatic stem-stake coupling device (ACD) has been specifically designed and integrated into the robotic system. When the ACD is positioned alongside the seedling's stem, the claw-shaped arms close to bring the stem and the stake together. A thin wire is then fed and guided through specialized components that shape it into the desired circular clip. The wire wraps around the stem and wooden stake. Finally, the Cutter trims the wire upon completion of the clip formation. The size and shape of the clips can be adjusted based on the seedling or plant. The stereo camera is installed on the ACD to provide real-time images for the machine vision system.
RESULTS. To assess the performance of the robotic system, we developed a robotic stem-stake coupling system comprising one gantry and two robotic arms, which we installed on a specialized tray commonly used in propagation facilities for transporting seedlings. The robotic system performed the stem-stake coupling task on three types of seedlings commonly grown in greenhouses: cucumber, pepper, and tomato.
To assess the performance of the robotic system and the applied vision system, we computed the recall and precision metrics for the results obtained from four different algorithms utilized by the machine vision system for the three seedlings above. These data were collected from 1000 attempts on three types of seedlings. Recall is defined as:
Recall = TP / (TP + FN) (17) and precision is defined as:
Precision = TP / (TP + FP) (18) where true positive (TP) represents the number of correctly detected stem-stake coupling points, false positive (FP) is the number of falsely detected points (false alarms), and false negative (FN) is the number of missed detections. Recall measures the system's ability to identify all relevant instances and thus penalizes missed detections, while precision reflects the success rate of correct detection, illustrating the system's accuracy in identifying the desired outcomes. We enlisted the expertise of farmers to determine which clips corresponded to TP or FN. These results are presented in Table 5. As observed, the RTPL method exhibits superior performance compared to the other three methods. The success rate of identifying the stem-stake coupling point in pepper seedlings surpasses that of the other seedlings, while that for cucumber seedlings is lower. This is attributed to the large leaves of the cucumber, which obscure much of the stem and stake, posing challenges in accurately recognizing the stem-stake coupling point.
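Equations (17) and (18) translate directly into code; the counts in the usage line are invented for illustration and are not the study's data.

```python
def recall_precision(tp, fp, fn):
    """Recall = TP/(TP+FN), Precision = TP/(TP+FP)."""
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    return recall, precision

# e.g., 890 correct detections, 60 false alarms, 50 missed points:
print(recall_precision(890, 60, 50))  # -> (0.9468..., 0.9368...)
```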
TABLE 5: Comparing recall and precision, the two key metrics, for assessing the effectiveness of four algorithms employed by the machine vision system in finding the clipping point.
To assess the efficacy of the robotic system, we analyzed its success rate, which is determined by the ratio of successfully coupled seedlings to the total number of seedlings when a user determines the position of the stem-stake coupling point using Manual Cursor Selection. This metric evaluates the performance of the mechatronics components of the robotic system, namely the robotic arm and the ACD, under the assumption that the machine vision system can accurately select the coupling point without any errors. Table 6 shows the success rate of the mechatronics components. To assess the overall performance of both machine vision and mechatronics components when the RSCS operates fully automatically, we utilized the RTPL method in machine vision, which has demonstrated the best performance in recognizing the coupling point. Subsequently, we assessed the robotic system’s success rate, with the entire stem-stake coupling process automated at this stage. Table 7 shows the success rate for a fully automated stem-stake coupling task.
TABLE 6: The success rate of the mechatronics components of the robotic system when the user pinpoints the position of the stem-stake coupling point.
TABLE 7: The success rate of the robotic system in a completely automatic operation using the RTPL method.
The average time required for each coupling task is another important factor. Table 8 displays the number of seedlings per tray and the stem-stake coupling speed achieved by each grower for three distinct plants in a propagation facility setting (these data originate from the propagation facility at Roelands Plant Farms). Furthermore, Table 8 includes the stem-stake coupling speed for two distinct configurations of the robotic system. The initial setup (P1) comprises a single robotic arm mounted on a Gantry, while the second configuration (P2) features twenty-four robotic arms distributed across three gantries.
TABLE 8: Comparing the stem-stake coupling speed between an expert farmer and a robotic arm in two configurations.
The stem-stake coupling speed of a single robotic arm within the robotic system is lower than that of the setup where multiple robotic arms are situated closely together on gantries, because in the latter configuration the robotic arms do not need to move much to reach the seedlings. The stem-stake coupling speed of a grower is approximately twice that of a robotic arm.
To better understand the limitations and potential areas for improvement in the robotic arm's control algorithm, we evaluated the accuracy of the robotic arm's joints in achieving the desired positioning and orientation within its working space. The results are depicted in Fig. 37 and Fig. 38, which include comprehensive data on torque, position, and orientation for both the desired and actual outcomes, as well as the associated errors. The figures clearly show the discrepancies between the intended and achieved positions and orientations, providing insights into the performance of the system. The calibration of the robotic arm and the machine vision involved 100 randomly distributed positions within the robot's working area. These positions were matched with the machine vision calculations using stereo matching, while all robot axes were activated. The robotic arm's positioning accuracy using machine vision was tested by reaching 50 known positions within the robot's working area. The robotic arm used machine vision calculations to reach these predetermined coordinates. After identifying and reaching the target point, we measured the errors in the X, Y, and Z directions (e_x, e_y, e_z), which are due to errors in both the positioning and the orientation of the ACD (Fig. 39a). We then calculated the robotic arm's cumulative error e_a, which is:
e_a = sqrt(e_x^2 + e_y^2 + e_z^2)
Fig. 39b displays the box-and-whisker plot illustrating the accuracy of the robotic arm and machine vision in determining and reaching the position and orientation of a specific point within the robotic arm’s working space. Fig. 39c illustrates the accuracy and repeatability of the robotic arm in reaching a specific point in the working space (black dots), as well as the performance of the integrated system where machine vision calculates the spatial coordinates of the point from images, and the robotic arm moves the ACD to the identified point (blue dots). Determining the spatial coordinates of a point through stereo matching and disparity reduces the accuracy of the integrated robotic arm and machine vision system.
The cumulative error across all 5 joints results in the robotic arm’s accuracy being within less than 1 mm. If the integrated system of machine vision and the robotic arm achieve positional accuracy within 10 mm, the claw-shaped arms of the ACD can correct for any discrepancies by precisely aligning the seedling’s stem and the stake at the center. This integrated system demonstrates a positional accuracy within 10 mm in 97.5% of instances.
These results validate an innovative robotic technology for automated stem-stake coupling tasks on seedlings using integrated machine vision capabilities. The proposed system, featuring three gantries and 24 robotic arms, has the potential to replace the work of 12 expert farmers in a propagation facility. With an average success rate of 89%, the system can identify unsuccessful attempts and flag them, allowing two additional farmers to complete the remaining coupling tasks. This effectively reduces the reliance on expert farmers from 12 to just 2. The RTPL technique, employed within the machine vision system, exhibits superior performance in accurately identifying the stem-stake coupling point compared to the other techniques. Each coupling task costs less than 1.5 cents when executed by the robotic system, in contrast to the 3 cents incurred when completed manually, which includes the labor cost and the cost of the clip itself. The cost reduction achieved through the use of the ACD is attributed to its utilization of thin wire to produce clips, as opposed to pre-made plastic clips, as well as the reduction of labor costs. This approach requires inexpensive materials and involves a clip-making process that consumes minimal energy. The use of metallic wire clips enhances environmental sustainability by reducing plastic waste.
Due to the object-oriented design of the robotic system, it is feasible to modify the configuration to enhance the stem-stake coupling speed. The robotic system is compatible with automated greenhouses and propagation facilities, requiring minimal alteration to their existing automatic lines.
The robotic system ensures maximum efficiency and productivity by allowing the task to be carried out around the clock with consistent performance and reliability. Each of the 24 robotic arms consumes 40 watts of power, resulting in a total system power usage of less than 1 kilowatt. This demonstrates the energy efficiency of the robotic system, as it minimizes power consumption while maintaining optimal performance.
The integrated machine vision and robotic arm system demonstrates a positional accuracy within 10 mm for 97.5% of attempts, with the cumulative error across all five joints resulting in the robotic arm achieving an accuracy of less than 1 mm. This level of precision makes the robotic system suitable for various precision agricultural applications.
An illustrative version and several variants of devices, systems and methods for processing or propagating plants have been described above without any intended loss of generality. Further examples of modifications and variation are now provided. Still further variants, modifications and combinations thereof are contemplated and will be apparent to the person of skill in the art. It is to be understood that illustrative variants or modifications are provided for the purpose of enhancing the understanding of the person of skill in the art and are not intended as limiting statements.
For example, the currently disclosed device, system and method provide plant processing in autonomous, semi-autonomous and manual modes. Image sensor and machine vision components combined with a robot and robot controller provide autonomous or semi-autonomous processing of plants, and specifically identifying and effecting target points in plants. In manual mode, the clipping device still provides benefit as a hand-held clipping device/gun.
The clipping device, used either by a farmer for semi-autonomous clipping or by an automatic machine or robot for fully autonomous clipping, can have an immense impact on the efficiency of the process, the reduction of labour costs associated with clipping tasks, and the prevention of work-related injuries, such as back injuries, strains, and sprains, associated with awkward body positions during clipping.
The clipping device can accommodate a variety of materials for making a clip, such as copper, stainless steel, polyethylene, polyamide, polyesters, or many varieties of plastic wire, including thermoplastic or thermoset materials. Many plastic blends, composites, or metal alloys may be used.
A clipping point is an example of a target point, and the machine vision and robot components may be adapted to other target points, including for example spraying points, pruning points, harvesting points, or any other target point that may be relevant to a horticultural or agricultural plant care or handling task.
A desired target point may vary according to a specific plant type and specific implementation. For example, variation of a clipping point is contemplated. The clipping point may be proximal to the highest point of a plant, higher than the uppermost node on the main stem. If the main stem is short between two nodes, the clipping point can be selected below the highest node or axil. However, leaves may be dense around the highest node, and some parts of the main stem may be covered by leaves, making recognition of the main stem and petiole difficult. The different shapes and types of seedlings make the recognition process even harder. The selection of the clipping point is a cognitive process that relies on heuristic information. Therefore, a desired clipping point may vary to suit a particular implementation, and the machine vision component may be configured and trained accordingly.
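As a rough, hypothetical sketch of such a heuristic (all coordinates, the minimum internode length, and the clearance offset below are illustrative assumptions, not values from the disclosure), the clipping height might be selected from detected node positions as follows:

```python
# Hypothetical y-coordinates (pixels, image top = 0) of detected stem nodes,
# ordered from the uppermost node downward.
nodes_y = [42, 95, 150, 210]
MIN_INTERNODE_PX = 30   # assumed minimum internode length for a stable clip
OFFSET_PX = 10          # assumed clearance from the chosen node

uppermost, second = nodes_y[0], nodes_y[1]
if second - uppermost >= MIN_INTERNODE_PX:
    clip_y = uppermost - OFFSET_PX   # clip above the uppermost node
else:
    clip_y = second - OFFSET_PX      # internode too short: clip below the highest node
print("candidate clipping height (px):", clip_y)
```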
Various techniques may be used in machine vision to find a suitable stem-stake coupling point, such as: Real-Time Point Recognition for Seedlings Using Kernel Density Estimators and Pyramid Histogram of Oriented Gradients, Real-time Point Localization on Plants using Feature-based Soft Margin SVM-PCA Method, and YOLO v8/v10.
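As a minimal sketch of the kernel-density idea underlying the first of these techniques (the pixel coordinates are hypothetical, and scipy’s gaussian_kde stands in for whatever estimator a particular implementation actually uses):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical (x, y) pixel coordinates of the segmented main-stem skeleton.
stem_pixels = np.array([[120, 40], [121, 60], [119, 80],
                        [122, 100], [120, 120], [118, 140]], dtype=float)

# Fit a 2D kernel density estimator over the stem pixels and score each
# candidate; the density peak is taken as the candidate coupling point.
kde = gaussian_kde(stem_pixels.T)
density = kde(stem_pixels.T)
coupling_point = stem_pixels[np.argmax(density)]
print("candidate stem-stake coupling point (px):", coupling_point)
```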
The machine vision component may analyze acquired image data using a feature descriptor. The feature descriptor may be variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof. Feature descriptors can be selected manually or automatically. To select effective feature descriptors for the adaptive feature-based plant recognition method (AFPR), we used correlation-based feature selection (CFS) to evaluate the appropriate correlation measure and a heuristic search strategy on datasets. Correlation is a statistical measure that expresses the strength of the relationship between two variables. A positive correlation occurs for variables that move in the same direction, and a negative correlation occurs when two variables move in opposite directions. Correlation is often used to determine whether there is a cause-and-effect relationship between two variables, and it is often used in machine learning to identify multicollinearity, that is, when two or more predictor variables are highly correlated with each other. Multicollinearity can impact the accuracy of predictive models and is an important indicator of the suitability of the variables selected for training. By reducing multicollinearity, we can improve the accuracy of predictions and prevent overfitting. We used seven feature descriptors after considering the Pearson correlation coefficient, checking the ratio among the covariances of features, and considering multicollinearity. Under heavy noise conditions, extracting the correlation coefficient between two sets of stochastic variables is nontrivial, in particular when canonical correlation analysis indicates degraded correlations due to heavy noise contributions. Considering this fact, we avoided using the Fourier shape descriptor and SAR because of the heavy noise contributions.
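The following sketch illustrates the correlation screen described above in simplified form; the feature matrix is synthetic, the forced collinear pair is contrived for demonstration, and the 0.9 cut-off is an assumed threshold rather than a value from the disclosure:

```python
import numpy as np

# Synthetic per-pixel feature-descriptor matrix: one column per descriptor.
rng = np.random.default_rng(seed=1)
X = rng.normal(size=(500, 7))
X[:, 3] = 0.95 * X[:, 0] + 0.05 * rng.normal(size=500)  # force one collinear pair
names = ["variance", "entropy", "kurtosis", "energy",
         "mean", "skewness", "superpixel"]

# Pairwise Pearson correlations; |r| above the cut-off flags multicollinearity.
corr = np.corrcoef(X, rowvar=False)
CUTOFF = 0.9
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(corr[i, j]) > CUTOFF:
            print(f"highly correlated: {names[i]} vs {names[j]} (r = {corr[i, j]:.2f})")
```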
The currently disclosed device, system and method can accommodate plant processing in any type of environment including, for example, horticulture, agriculture, outdoor farming, indoor greenhouse, and the like.
Embodiments disclosed herein, or portions thereof, can be implemented by programming one or more computer systems or devices with computer-executable instructions embodied in a non-transitory computer-readable medium. When executed by a processor, these instructions cause these computer systems and devices to perform one or more functions particular to embodiments disclosed herein. Programming techniques, computer languages, devices, and computer-readable media necessary to accomplish this are known in the art.
In an example, a non-transitory computer readable medium embodying a computer program for processing plants, may comprise: computer program code for acquiring image data of a plant with an image sensor; computer program code for analyzing the acquired image data with a machine vision component to recognize and segment at least a first anatomical structure of the plant to output segmented image data; computer program code for identifying a target point in the plant based on the segmented image data; computer program code for determining distance/depth from the image sensor to the target point in the plant; and computer program code for generating a control signal based on the distance/depth and sending the control signal from a controller to position a robot at the target point in the plant. In another related example, the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data. In still another related example, the computer readable medium further comprises computer program code for applying a kernel density estimator to the segmented image data to determine the target point. In yet another related example, the computer readable medium further comprises computer program code for validating the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
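In schematic form, the program flow recited above might look as follows; every name here (acquire_image, segment_plant, find_target_point, estimate_depth, make_control_signal, move_to) is a hypothetical stand-in rather than an actual API:

```python
def process_plant(sensor, vision, controller, robot):
    """Schematic sketch of the disclosed processing loop (all methods hypothetical)."""
    image = sensor.acquire_image()                    # acquire image data of a plant
    segmented = vision.segment_plant(image)           # recognize/segment anatomical structures
    target_px = vision.find_target_point(segmented)   # e.g., a kernel-density peak
    depth = sensor.estimate_depth(target_px)          # stereo disparity, LIDAR, or TOF
    signal = controller.make_control_signal(target_px, depth)
    robot.move_to(signal)                             # position the robot at the target point
```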
The computer readable medium is a data storage device that can store data which can thereafter be read by a computer system. Examples of a computer readable medium include read-only memory, random-access memory, CD-ROMs, magnetic tape, optical data storage devices and the like. The computer readable medium may be geographically localized or may be distributed over a network-coupled computer system so that the computer readable code is stored and executed in a distributed fashion.
Computer-implementation of the system or method typically comprises a memory, an interface and a processor. The types and arrangements of memory, interface and processor may be varied according to implementations. For example, the interface may include a software interface that communicates with an end-user computing device through an Internet connection. The interface may also include a physical electronic device configured to receive requests or queries from a device sending digital and/or analog information. In other examples, the interface can include a physical electronic device configured to receive signals and/or data relating to the plant processing method and system, for example from an imaging sensor or camera or image processing device.
Any suitable processor type may be used depending on a specific implementation, including for example, a microprocessor, a programmable logic controller or a field programmable logic array. Moreover, any conventional computer architecture may be used for computer-implementation of the system or method including for example a memory, a mass storage device, a processor (CPU), a graphical processing unit (GPU), a Read-Only Memory (ROM), and a Random-Access Memory (RAM) generally connected to a system bus of data-processing apparatus. Memory can be implemented as a ROM, RAM, a combination thereof, or simply a general memory unit. Software modules in the form of routines and/or subroutines for carrying out features of the system or method can be stored within memory and then retrieved and processed via processor to perform a particular task or function. Similarly, one or more method steps may be encoded as a program component, stored as executable instructions within memory and then retrieved and processed via a processor. A user input device, such as a keyboard, mouse, or another pointing device, can be connected to a PCI (Peripheral Component Interconnect) bus. If desired, the software may provide an environment that represents programs, files, options, and so forth by means of graphically displayed icons, menus, and dialog boxes on a computer monitor screen. For example, any number of plant images or clipping device characteristics or robot arm characteristics may be displayed.
Computer-implementation of the system or method may accommodate any type of end-user computing device including computing devices communicating over a networked connection. The computing device may display graphical interface elements for performing the various functions of the system or method, including for example display of a clipping device characteristic or a robot arm characteristic during a stem-stake coupling task. For example, the computing device may be a server, desktop, laptop, notebook, tablet, personal digital assistant (PDA), PDA phone or smartphone, and the like. The computing device may be implemented using any appropriate combination of hardware and/or software configured for wired and/or wireless communication. Communication can occur over a network, for example, where remote control of the system is desired.
If a networked connection is desired the system or method may accommodate any type of network. The network may be a single network or a combination of multiple networks. For example, the network may include the internet and/or one or more intranets, landline networks, wireless networks, and/or other appropriate types of communication networks. In another example, the network may comprise a wireless telecommunications network (e.g., cellular phone network) adapted to communicate with other communication networks, such as the Internet. For example, the network may comprise a computer network that makes use of a TCP/IP protocol (including protocols based on TCP/IP protocol, such as HTTP, HTTPS or FTP).
Still further variation can be illustrated in the following examples that are numerically and sequentially labelled for convenience of understanding (and the numerical ordering of the following examples is self-contained in this section and should not be confused with the numerical ordering of experimental examples 1-3).
Example 1. A stem-to-stake clipping device, comprising: a passively rotating spool supporting and supplying a roll of wire; a feeder wheel pushing the wire through an entrance of a wire guide and an exit of a wire guide; a cam co-rotationally coupled to the feeder wheel; a rotational actuator driving rotation of the feeder wheel and the cam; a bender positioned proximal to the exit of the wire guide, the bender providing a strike surface for curving the wire into a circular clip; a cutter positioned proximal to the exit of the wire guide, the cutter providing an edge for cutting the wire; a lever having a first end abutting the cam, a second end positioning the cutter and the bender, and an intermediate pivot point; the lever following the cam to move from a first pivot position aligning the bender strike surface with wire pushed through the exit end to a second pivot position sweeping the cutter edge across the exit end to cut the wire.
Example 2. The device of example 1, wherein a circumference of the cam is eccentric to a rotation point of the cam, the lever moving from a first pivot position to a second pivot position by following the circumference of the cam.
Example 3. The device of example 2, wherein the cam is a drop cam with a single projecting tab/tooth, and the lever maintains the first pivot position for a major portion of each cam rotation, and the lever moves to the second pivot position when following the single projecting tab/tooth.
Example 4. The device of example 3, wherein the lever moves to the second pivot position when following in sequence a base to an apex of the single projecting tab/tooth, and returns to the first pivot position when following in sequence the apex to the base of the single projecting tab/tooth.
Example 5. The device of any one of examples 1-4, wherein the intermediate pivot point of the lever is rotationally coupled to the wire guide.
Example 6. The device of any one of examples 1-5, wherein the cutter is mounted on the second end of the lever, and the bender is mounted on the cutter.
Example 7. The device of any one of examples 1-6, wherein the feeder wheel pulls the wire from the spool and pushes the wire through the wire guide.
Example 8. The device of any one of examples 1-7, wherein a central groove is circumferentially formed in the feeder wheel for receiving the wire, and a plurality of transverse grooves intersect the central groove, intersecting edges of each of the plurality of transverse grooves and the central groove forming a friction surface for engaging the wire.
Example 9. The device of any one of example 8, wherein an increase in the number of transverse grooves is positively correlated with the length of the wire pulled from the spool in a single rotation cycle of the feeder wheel.
Example 10. The device of example 9, wherein the length of the wire pulled from the spool is determined by
$L = k_f\,(2\pi - \alpha)\left(\mu_s + (\mu_k - \mu_s)\,e^{-0.09n}\right) r$

where

L: Length of the wire; n: Number of Transverse Grooves; α: Angle of a Resting Gap formed in the feeder wheel; k_f: wire material-dependent coefficient; μ_s: Coefficient of static friction; μ_k: Coefficient of sliding friction; r: radius of the feeder wheel. (An illustrative numerical sketch of this relation is provided following the examples list below.)
Example 11. The device of any one of examples 1-9, wherein a gap is formed in a circumference of the feeder wheel, the gap reducing frictional force of the feeder wheel on the wire, the gap rotating to face the wire in overlapping coordination with the second pivot position of the lever.
Example 12. The device of example 11, wherein the gap faces the wire simultaneously with the second pivot position.
Example 13. The device of example 11, wherein the feeder wheel is a planar disc and the gap is formed as a flattened portion of the circumference of the feeder wheel.
Example 14. The device of example 11, wherein the gap provides a smooth surface devoid of a central groove and devoid of a transverse groove.
Example 15. The device of any one of examples 1-14, wherein the wire guide is formed as a body with a bore extending through the body, the bore defining a lumen having a diameter sized to receive the wire, the lumen communicative with a first open end and an opposing second open end, the first open end is the entrance of the wire guide and the second open end is the exit of the wire guide.
Example 16. The device of any one of examples 1-15, wherein the bender strike surface is adjustable and tunable, and a change in a position of the strike surface changes the curving of the wire and the size or shape of the circular clip.
Example 17. The device of any one of examples 1-15, wherein the first pivot position is variable in a single rotational cycle by the lever following a varied radius of the cam.
Example 18. The device of example 17, wherein variation of the first pivot position in the single rotation cycle changes the curving of the wire and produces a spiral shape/configuration of the circular clip.
Example 19. The device of any one of examples 1-18, wherein the rotational actuator is a servo motor, a stepper motor, an AC motor, a DC motor, a pneumatic actuator or a hydraulic actuator.

Example 20. The device of any one of examples 1-19, further comprising a clamp comprising first and second opposing jaws rotationally mounted to a frame of the device at first and second clamp rotation points, respectively, the clamp aligned with the exit of the wire guide.
Example 21. The device of example 20, further comprising a linear actuator pivotably coupled to both of the first and second opposing jaws at a common third clamp rotation point, the linear actuator driving counter-rotation of first and second opposing jaws to circumferentially reduce an open space between the first and second opposing jaws.
Example 22. The device of example 21, wherein the linear actuator is a servo motor (or a stepper motor, an AC motor, a DC motor, a pneumatic actuator or a hydraulic actuator) driving a pinion that engages a rack, an end of the rack coupled to the common third clamp rotation point.
Example 23. The device of any one of examples 20-22, further comprising a camera mounted to the frame of the device, an orientation and a field of view of the camera configured to capture the clamp.
Example 24. The device of example 23, wherein the camera is a stereo camera comprising at least two lenses.
Example 25. The device of example 23 or 24, wherein the camera includes a light detection and ranging (LIDAR) sensor.
Example 26. The device of example 23 or 24, wherein the camera includes a time-of-flight (TOF) sensor.
Example 27. The device of any one of examples 1-26, further comprising an optical sensor positioned proximal to a circumference of the feeder wheel, the optical sensor triggered by an indicator on the feeder wheel, the optical sensor sending a pulse signal after completion of each cycle of clipping.
Example 28. The device of any one of examples 1-27, further comprising a heater element coupled to the wire guide.
Example 29. The device of any one of examples 1-28, wherein the wire is a metal material that is copper, stainless steel, or any alloy thereof.
Example 30. The device of any one of examples 1-28, wherein the wire is a plastic material that is polyethylene, polyamide, polyester, any blend thereof, or any composite thereof.
Example 31. A method for processing plants, the method comprising: acquiring image data of a plant with an image sensor; analyzing the acquired image data with a machine vision component to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identifying a target point in the plant based on the segmented image data; determining distance/depth from the image sensor to the target point in the plant; generating a control signal based on the distance/depth and sending the control signal from a controller to position a robot at the target point in the plant.
Example 32. The method of example 31, wherein the image sensor is part of a camera.
Example 33. The method of example 32, wherein the camera is a stereo camera comprising at least two lenses.
Example 34. The method of any one of examples 31-33, wherein the image sensor is a light detection and ranging (LIDAR) sensor.
Example 35. The method of any one of examples 31-33, wherein the image sensor is a time-of-flight (TOF) sensor.
Example 36. The method of any one of examples 31-35, wherein the acquired image data is transferred to a LAB color space.
Example 37. The method of any one of examples 31-36, wherein the machine vision component analyzes the acquired image data using a feature descriptor.
Example 38. The method of example 37, wherein the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data.
Example 39. The method of example 37 or 38, wherein the machine vision component is trained using labeled images and K-fold cross-validation.
Example 40. The method of any of examples 37-39, wherein the machine vision component comprises an artificial neural network.
Example 41. The method of any one of examples 37-40, wherein the feature descriptor is variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof.
Example 42. The method of any one of examples 37-41, wherein the feature descriptor is selected by an automated algorithm.
Example 43. The method of any one of examples 31-42, further comprising applying morphological image processing to the segmented image data.
Example 44. The method of example 43, wherein the morphological image processing outputs a boundary image data of the at least first anatomical structure of the plant.
Example 45. The method of example 43, wherein the morphological image processing outputs a skeleton image data of the at least first anatomical structure of the plant.

Example 46. The method of example 43, wherein the morphological image processing eliminates at least a second anatomical structure of the plant from the segmented image data.
Example 47. The method of any one of examples 31-46, further comprising applying a point density variation to the segmented image data to determine the target point.
Example 48. The method of any one of examples 31-46, further comprising applying a kernel density estimator to the segmented image data to determine the target point.
Example 49. The method of any one of examples 31-48, further comprising validating the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
Example 50. The method of example 49, further comprising normalizing the first principal orientation of the histogram gradient of the voxel by matching and correlating with a predetermined second principal orientation of a histogram gradient of a ground voxel encompassing a predetermined suitable target point.
Example 51. The method of any one of examples 31-50, further comprising projecting a coordinate map of the image sensor to a coordinate map of the robot.
Example 52. The method of example 51, wherein the controller receives real-time image data identifying the target point location, the controller receives real-time location data identifying a current location of the robot, and the controller generating and communicating the control signal to the robot to minimize a difference between the real-time location data and the target point location.
Example 53. The method of any one of examples 31-52, further comprising mounting an end-effector to the robot.
Example 54. The method of example 53, wherein the end-effector is a clipping device for generating a clip to link a stem of the plant to a supporting stake, or the end-effector is a spraying device for localized spraying of the plant, or the end-effector is a cutting device for localized pruning of the plant.
Example 55. The method of any one of examples 31-54, further comprising mounting a sensor to the robot.
Example 56. The method of example 55, wherein the sensor is the image sensor, or the sensor is a localization sensor, or the sensor is a contact-force sensor, or the sensor is a thermal sensor.
Example 57. The method of any one of examples 31-56, wherein the target point is a clipping point for linking a stem of the plant to a supporting stake, or the target point is a spraying point for localized spraying of the plant, or the target point is a cutting point for localized pruning of the plant.
Example 58. The method of any one of examples 31-56, wherein the plant is grown inside a greenhouse.
Example 59. The method of any one of examples 31-56, wherein the plant is grown in an agricultural farming outdoor facility and is exposed to natural weather elements.
Example 60. The method of any one of examples 31-58, wherein the at least first anatomical structure of the plant is a stem, leaf, branch, flower, bud, node, or petiole.
Example 61. The method of any one of examples 31-59, wherein the at least first anatomical structure of the plant is a plurality of anatomical structures, and the acquired image data captures multiple nodes of a stem and leaves of the plant.
Example 62. A system for processing plants, the system comprising: a memory configured to store image data; an image sensor configured to acquire image data of a plant; a processor configured to: analyze the acquired image data with a machine vision component trained to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identify a target point in the plant based on the segmented image data; determine distance/depth from the image sensor to the target point in the plant; a controller configured to generate a control signal based on the distance/depth and communicate the control signal to a robot to position the robot at the target point in the plant.
Example 63. The system of example 62, wherein the image sensor is part of a camera.
Example 64. The system of example 63, wherein the camera is a stereo camera comprising at least two lenses.
Example 65. The system of any one of examples 62-64, wherein the image sensor is a light detection and ranging (LIDAR) sensor.
Example 66. The system of any one of examples 62-64, wherein the image sensor is a time-of-flight (TOF) sensor.
Example 67. The system of any one of examples 62-66, wherein the acquired image data is transferred to a LAB color space.
Example 68. The system of any one of examples 62-67, wherein the machine vision component analyzes the acquired image data using a feature descriptor.
Example 69. The system of example 68, wherein the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data.

Example 70. The system of example 68 or 69, wherein the machine vision component is trained using labeled images and K-fold cross-validation.
Example 71. The system of any of examples 68-70, wherein the machine vision component comprises an artificial neural network.
Example 72. The system of any one of examples 68-71, wherein the feature descriptor is variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof.
Example 73. The system of any one of examples 68-72, wherein the feature descriptor is selected by an automated algorithm.
Example 74. The system of any one of examples 62-73, wherein the processor is configured to apply morphological image processing to the segmented image data.
Example 75. The system of example 74, wherein the morphological image processing outputs a boundary image data of the at least first anatomical structure of the plant.
Example 76. The system of example 75, wherein the morphological image processing outputs a skeleton image data of the at least first anatomical structure of the plant.
Example 77. The system of example 74, wherein the morphological image processing eliminates at least a second anatomical structure of the plant from the segmented image data.
Example 78. The system of any one of examples 62-77, wherein the processor is configured to apply a point density variation to the segmented image data to determine the target point.
Example 79. The system of any one of examples 62-77, wherein the processor is configured to apply a kernel density estimator to the segmented image data to determine the target point.
Example 80. The system of any one of examples 62-79, wherein the processor is configured to validate the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
Example 81. The system of example 80, wherein the processor is configured to normalize the first principal orientation of the histogram gradient of the voxel by matching and correlating with a predetermined second principal orientation of a histogram gradient of a ground voxel encompassing a predetermined suitable target point.
Example 82. The system of any one of examples 62-81, wherein the processor is configured to project a coordinate map of the image sensor to a coordinate map of the robot.
Example 83. The system of example 82, wherein the controller receives real-time image data identifying the target point location, the controller receives real-time location data identifying a current location of the robot, and the controller generates and communicates the control signal to the robot to minimize a difference between the real-time location data and the target point location.
Example 84. The system of any one of examples 62-83, further comprising an end-effector mounted to the robot.
Example 85. The system of example 84, wherein the end-effector is a clipping device for generating a clip to link a stem of the plant to a supporting stake, or the end-effector is a spraying device for localized spraying of the plant, or the end-effector is a cutting device for localized pruning of the plant.
Example 86. The system of any one of examples 62-85, further comprising a sensor mounted to the robot.
Example 87. The system of example 86, wherein the sensor is the image sensor, or the sensor is a localization sensor, or the sensor is a contact-force sensor, or the sensor is a thermal sensor.
Example 88. The system of any one of examples 62-87, wherein the target point is a clipping point for linking a stem of the plant to a supporting stake, or the target point is a spraying point for localized spraying of the plant, or the target point is a cutting point for localized pruning of the plant.
Example 89. The system of any one of examples 62-87, wherein the plant is grown inside a greenhouse.
Example 90. The system of any one of examples 62-87, wherein the plant is grown in an agricultural farming outdoor facility and is exposed to natural weather elements.
Example 91. The system of any one of examples 62-90, wherein the at least first anatomical structure of the plant is a stem, leaf, branch, flower, bud, node, or petiole.
Example 92. The system of any one of examples 62-91, wherein the at least first anatomical structure of the plant is a plurality of anatomical structures, and the acquired image data captures multiple nodes of a stem and leaves of the plant.
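As flagged at example 10 above, the following minimal sketch evaluates the wire-length relation numerically; all parameter values are illustrative assumptions, not values from the disclosure:

```python
import math

def wire_length(n, alpha, k_f, mu_s, mu_k, r):
    """Wire length fed per cycle: L = k_f (2*pi - alpha) (mu_s + (mu_k - mu_s) e^(-0.09 n)) r."""
    return k_f * (2 * math.pi - alpha) * (mu_s + (mu_k - mu_s) * math.exp(-0.09 * n)) * r

# Illustrative values only: 12 transverse grooves, a 30-degree resting gap,
# unit material coefficient, typical friction coefficients, 20 mm wheel radius.
L = wire_length(n=12, alpha=math.radians(30), k_f=1.0, mu_s=0.6, mu_k=0.45, r=0.02)
print(f"wire fed per cycle: {L * 1000:.1f} mm")
```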
Embodiments described herein are intended for illustrative purposes without any intended loss of generality. Still further variants, modifications and combinations thereof are contemplated and will be recognized by the person of skill in the art. Accordingly, the foregoing detailed description is not intended to limit scope, applicability, or configuration of claimed subject matter.

Claims

WHAT IS CLAIMED IS:
1. A stem-to-stake clipping device, comprising: a passively rotating spool supporting and supplying a roll of wire; a feeder wheel pushing the wire through an entrance of a wire guide and an exit of a wire guide; a cam co-rotationally coupled to the feeder wheel; a rotational actuator driving rotation of the feeder wheel and the cam; a bender positioned proximal to the exit of the wire guide, the bender providing a strike surface for curving the wire into a circular clip; a cutter positioned proximal to the exit of the wire guide, the cutter providing an edge for cutting the wire; a lever having a first end abutting the cam, a second end positioning the cutter and the bender, and an intermediate pivot point; the lever following the cam to move from a first pivot position aligning the bender strike surface with wire pushed through the exit end to a second pivot position sweeping the cutter edge across the exit end to cut the wire.
2. The device of claim 1, wherein a circumference of the cam is eccentric to a rotation point of the cam, the lever moving from a first pivot position to a second pivot position by following the circumference of the cam.
3. The device of claim 2, wherein the cam is a drop cam with a single projecting tab/tooth, and the lever maintains the first pivot position for a major portion of each cam rotation, and the lever moves to the second pivot position when following the single projecting tab/tooth.
4. The device of claim 3, wherein the lever moves to the second pivot position when following in sequence a base to an apex of the single projecting tab/tooth, and returns to the first pivot position when following in sequence the apex to the base of the single projecting tab/tooth.
5. The device of any one of claims 1-4, wherein the intermediate pivot point of the lever is rotationally coupled to the wire guide.
6. The device of any one of claims 1-5, wherein the cutter is mounted on the second end of the lever, and the bender is mounted on the cutter.
7. The device of any one of claims 1-6, wherein the feeder wheel pulls the wire from the spool and pushes the wire through the wire guide.
8. The device of any one of claims 1-7, wherein a central groove is circumferentially formed in the feeder wheel for receiving the wire, and a plurality of transverse grooves intersect the central groove, intersecting edges of each of the plurality of transverse grooves and the central groove forming a friction surface for engaging the wire.
9. The device of claim 8, wherein an increase in the number of transverse grooves is positively correlated with the length of the wire pulled from the spool in a single rotation cycle of the feeder wheel.
10. The device of claim 9, wherein the length of the wire pulled from the spool is determined by
$L = k_f\,(2\pi - \alpha)\left(\mu_s + (\mu_k - \mu_s)\,e^{-0.09n}\right) r$

where

L: Length of the wire; n: Number of Transverse Grooves; α: Angle of a Resting Gap formed in the feeder wheel; k_f: wire material-dependent coefficient; μ_s: Coefficient of static friction; μ_k: Coefficient of sliding friction; r: radius of the feeder wheel.
11. The device of any one of claims 1-9, wherein a gap is formed in a circumference of the feeder wheel, the gap reducing frictional force of the feeder wheel on the wire, the gap rotating to face the wire in overlapping coordination with the second pivot position of the lever.
12. The device of claim 11, wherein the gap faces the wire simultaneously with the second pivot position.
13. The device of claim 11, wherein the feeder wheel is a planar disc and the gap is formed as a flattened portion of the circumference of the feeder wheel.
14. The device of claim 11, wherein the gap provides a smooth surface devoid of a central groove and devoid of a transverse groove.
15. The device of any one of claims 1-14, wherein the wire guide is formed as a body with a bore extending through the body, the bore defining a lumen having a diameter sized to receive the wire, the lumen communicative with a first open end and an opposing second open end, the first open end is the entrance of the wire guide and the second open end is the exit of the wire guide.
16. The device of any one of claims 1-15, wherein the bender strike surface is adjustable and tunable, and a change in a position of the strike surface changes the curving of the wire and the size or shape of the circular clip.
17. The device of any one of claims 1-15, wherein the first pivot position is variable in a single rotational cycle by the lever following a varied radius of the cam.
18. The device of claim 17, wherein variation of the first pivot position in the single rotation cycle changes the curving of the wire and produces a spiral shape/configuration of the circular clip.
19. The device of any one of claims 1-18, wherein the rotational actuator is a servo motor, a stepper motor, an AC motor, a DC motor, a pneumatic actuator or a hydraulic actuator.
20. The device of any one of claims 1-19, further comprising a clamp comprising first and second opposing jaws rotationally mounted to a frame of the device at first and second clamp rotation points, respectively, the clamp aligned with the exit of the wire guide.
21. The device of claim 20, further comprising a linear actuator pivotably coupled to both of the first and second opposing jaws at a common third clamp rotation point, the linear actuator driving counter-rotation of first and second opposing jaws to circumferentially reduce an open space between the first and second opposing jaws.
22. The device of claim 21, wherein the linear actuator is a servo motor (or a stepper motor, an AC motor, a DC motor, a pneumatic actuator or a hydraulic actuator) driving a pinion that engages a rack, an end of the rack coupled to the common third clamp rotation point.
23. The device of any one of claims 20-22, further comprising a camera mounted to the frame of the device, an orientation and a field of view of the camera configured to capture the clamp.
24. The device of claim 23, wherein the camera is a stereo camera comprising at least two lenses.
25. The device of claim 23 or 24, wherein the camera includes a light detection and ranging (LIDAR) sensor.
26. The device of claim 23 or 24, wherein the camera includes a time-of-flight (TOF) sensor.
27. The device of any one of claims 1-26, further comprising an optical sensor positioned proximal to a circumference of the feeder wheel, the optical sensor triggered by an indicator on the feeder wheel, the optical sensor sending a pulse signal after completion of each cycle of clipping.
28. The device of any one of claims 1-27, further comprising a heater element coupled to the wire guide.
29. The device of any one of claims 1-28, wherein the wire is a metal material that is copper, stainless steel, or any alloy thereof.
30. The device of any one of claims 1-28, wherein the wire is a plastic material that is polyethylene, polyamide, polyester, any blend thereof, or any composite thereof.
31. A method for processing plants, the method comprising: acquiring image data of a plant with an image sensor; analyzing the acquired image data with a machine vision component to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identifying a target point in the plant based on the segmented image data; determining distance/depth from the image sensor to the target point in the plant; generating a control signal based on the distance/depth and sending the control signal from a controller to position a robot at the target point in the plant.
32. The method of claim 31, wherein the image sensor is part of a camera.
33. The method of claim 32, wherein the camera is a stereo camera comprising at least two lenses.
34. The method of any one of claims 31-33, wherein the image sensor is a light detection and ranging (LIDAR) sensor.
35. The method of any one of claims 31-33, wherein the image sensor is a time-of-flight (TOF) sensor.
36. The method of any one of claims 31-35, wherein the acquired image data is transferred to a LAB color space.
37. The method of any one of claims 31-36, wherein the machine vision component analyzes the acquired image data using a feature descriptor.
38. The method of claim 37, wherein the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data.
39. The method of claim 37 or 38, wherein the machine vision component is trained using labeled images and K-fold cross-validation.
40. The method of any of claims 37-39, wherein the machine vision component comprises an artificial neural network.
41. The method of any one of claims 37-40, wherein the feature descriptor is variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof.
42. The method of any one of claims 37-41, wherein the feature descriptor is selected by an automated algorithm.
43. The method of any one of claims 31-42, further comprising applying morphological image processing to the segmented image data.
44. The method of claim 43, wherein the morphological image processing outputs a boundary image data of the at least first anatomical structure of the plant.
45. The method of claim 43, wherein the morphological image processing outputs a skeleton image data of the at least first anatomical structure of the plant.
46. The method of claim 43, wherein the morphological image processing eliminates at least a second anatomical structure of the plant from the segmented image data.
47. The method of any one of claims 31-46, further comprising applying a point density variation to the segmented image data to determine the target point.
48. The method of any one of claims 31-46, further comprising applying a kernel density estimator to the segmented image data to determine the target point.
49. The method of any one of claims 31-48, further comprising validating the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
50. The method of claim 49, further comprising normalizing the first principal orientation of the histogram gradient of the voxel by matching and correlating with a predetermined second principal orientation of a histogram gradient of a ground voxel encompassing a predetermined suitable target point.
51. The method of any one of claims 31-50, further comprising projecting a coordinate map of the image sensor to a coordinate map of the robot.
52. The method of claim 51, wherein the controller receives real-time image data identifying the target point location, the controller receives real-time location data identifying a current location of the robot, and the controller generating and communicating the control signal to the robot to minimize a difference between the real-time location data and the target point location.
53. The method of any one of claims 31-52, further comprising mounting an end-effector to the robot.
54. The method of claim 53, wherein the end-effector is a clipping device for generating a clip to link a stem of the plant to a supporting stake, or the end-effector is a spraying device for localized spraying of the plant, or the end-effector is a cutting device for localized pruning of the plant.
55. The method of any one of claims 31-54, further comprising mounting a sensor to the robot.
56. The method of claim 55, wherein the sensor is the image sensor, or the sensor is a localization sensor, or the sensor is a contact-force sensor, or the sensor is a thermal sensor.
57. The method of any one of claims 31-56, wherein the target point is a clipping point for linking a stem of the plant to a supporting stake, or the target point is a spraying point for localized spraying of the plant, or the target point is a cutting point for localized pruning of the plant.
58. The method of any one of claims 31-56, wherein the plant is grown inside a greenhouse.
59. The method of any one of claims 31-56, wherein the plant is grown in an agricultural farming outdoor facility and is exposed to natural weather elements.
60. The method of any one of claims 31-58, wherein the at least first anatomical structure of the plant is a stem, leaf, branch, flower, bud, node, or petiole.
61. The method of any one of claims 31-59, wherein the at least first anatomical structure of the plant is a plurality of anatomical structures, and the acquired image data captures multiple nodes of a stem and leaves of the plant.
62. A system for processing plants, the system comprising: a memory configured to store image data; an image sensor configured to acquire image data of a plant; a processor configured to: analyze the acquired image data with a machine vision component trained to recognize and segment at least a first anatomical structure of the plant to output segmented image data; identify a target point in the plant based on the segmented image data; determine distance/depth from the image sensor to the target point in the plant; a controller configured to generate a control signal based on the distance/depth and communicate the control signal to a robot to position the robot at the target point in the plant.
63. The system of claim 62, wherein the image sensor is part of a camera.
64. The system of claim 63, wherein the camera is a stereo camera comprising at least two lenses.
65. The system of any one of claims 62-64, wherein the image sensor is a light detection and ranging (LIDAR) sensor.
66. The system of any one of claims 62-64, wherein the image sensor is a time-of-flight (TOF) sensor.
67. The system of any one of claims 62-66, wherein the acquired image data is transferred to a LAB color space.
68. The system of any one of claims 62-67, wherein the machine vision component analyzes the acquired image data using a feature descriptor.
69. The system of claim 68, wherein the machine vision component is trained to input feature descriptor data and to output cut-off values for multilevel thresholds for each pixel to generate the segmented image data from the acquired image data.
70. The system of claim 68 or 69, wherein the machine vision component is trained using labeled images and K-fold cross-validation.
71. The system of any of claims 68-70, wherein the machine vision component comprises an artificial neural network.
72. The system of any one of claims 68-71, wherein the feature descriptor is variance, entropy, kurtosis, energy, mean value, skewness, 2D superpixel, or any combination thereof.
73. The system of any one of claims 68-72, wherein the feature descriptor is selected by an automated algorithm.
74. The system of any one of claims 62-73, wherein the processor is configured to apply morphological image processing to the segmented image data.
75. The system of claim 74, wherein the morphological image processing outputs a boundary image data of the at least first anatomical structure of the plant.
76. The system of claim 75, wherein the morphological image processing outputs a skeleton image data of the at least first anatomical structure of the plant.
77. The system of claim 74, wherein the morphological image processing eliminates at least a second anatomical structure of the plant from the segmented image data.
78. The system of any one of claims 62-77, wherein the processor is configured to apply a point density variation to the segmented image data to determine the target point.
79. The system of any one of claims 62-77, wherein the processor is configured to apply a kernel density estimator to the segmented image data to determine the target point.
80. The system of any one of claims 62-79, wherein the processor is configured to validate the target point by calculating size and angle of a first principal orientation of a histogram gradient of a voxel of the segmented image data encompassing the target point.
81. The system of claim 80, wherein the processor is configured to normalize the first principal orientation of the histogram gradient of the voxel by matching and correlating with a predetermined second principal orientation of a histogram gradient of a ground voxel encompassing a predetermined suitable target point.
82. The system of any one of claims 62-81, wherein the processor is configured to project a coordinate map of the image sensor to a coordinate map of the robot.
83. The system of claim 82, wherein the controller receives real-time image data identifying the target point location, the controller receives real-time location data identifying a current location of the robot, and the controller generates and communicates the control signal to the robot to minimize a difference between the real-time location data and the target point location.
84. The system of any one of claims 62-83, further comprising an end-effector mounted to the robot.
85. The system of claim 84, wherein the end-effector is a clipping device for generating a clip to link a stem of the plant to a supporting stake, or the end-effector is a spraying device for localized spraying of the plant, or the end-effector is a cutting device for localized pruning of the plant.
86. The system of any one of claims 62-85, further comprising a sensor mounted to the robot.
87. The system of claim 86, wherein the sensor is the image sensor, or the sensor is a localization sensor, or the sensor is a contact-force sensor, or the sensor is a thermal sensor.
88. The system of any one of claims 62-87, wherein the target point is a clipping point for linking a stem of the plant to a supporting stake, or the target point is a spraying point for localized spraying of the plant, or the target point is a cutting point for localized pruning of the plant.
89. The system of any one of claims 62-87, wherein the plant is grown inside a greenhouse.
90. The system of any one of claims 62-87, wherein the plant is grown in an agricultural farming outdoor facility and is exposed to natural weather elements.
91. The system of any one of claims 62-90, wherein the at least first anatomical structure of the plant is a stem, leaf, branch, flower, bud, node, or petiole.
92. The system of any one of claims 62-91, wherein the at least first anatomical structure of the plant is a plurality of anatomical structures, and the acquired image data captures multiple nodes of a stem and leaves of the plant.