
WO2018176025A1 - System and method for designing autonomous systems

System and method for designing autonomous systems

Info

Publication number
WO2018176025A1
WO2018176025A1 (PCT/US2018/024254)
Authority
WO
WIPO (PCT)
Prior art keywords
objects
data
physical
skill
representative
Prior art date
Application number
PCT/US2018/024254
Other languages
English (en)
Inventor
Richard Gary Mcdaniel
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft filed Critical Siemens Aktiengesellschaft
Publication of WO2018176025A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/41885 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by modeling, simulation of the manufacturing system
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1661 Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/31 From computer integrated manufacturing till monitoring
    • G05B2219/31264 Control, autonomous self learn knowledge, rearrange task, reallocate resources
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/33 Director till display
    • G05B2219/33054 Control agent, an active logical entity that can control logical objects
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/36 Nc in input of data, input key till input tape
    • G05B2219/36017 Graphic assisted robot programming, display projection of surface
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39244 Generic motion control operations, primitive skills each for special task
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40323 Modeling robot environment for sensor based robot system
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/50 Machine tool, machine tool null till machine tool work handling
    • G05B2219/50362 Load unload with robot
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • This application relates to engineering autonomous systems.
  • More specifically, this application relates to data objects for implementing reusable skills in autonomous systems.
  • A system for autonomous operation includes a world model of the system.
  • The world model includes a plurality of data objects representative of physical objects in the system, a plurality of data objects representative of environmental qualities of the system, a plurality of data objects representative of work objects that are acted upon by the physical objects, and a plurality of data objects representative of functional relationships between the physical objects and the objects that the physical objects act upon.
  • The system may further include an autonomous processor configured to receive information from the data objects representative of physical objects, environmental qualities, work objects, and functional relationships, and to perform reasoning operations to select a desired behavior of the system.
  • The autonomous processor changes at least one property of at least one data object responsive to the selected desired behavior.
  • Each data object may be representative of a physical aspect of the object it represents and representative of a logical aspect of the object it represents.
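The four kinds of world-model data objects described above, each with a physical and a logical aspect, can be sketched as simple classes. This is an illustrative reading only; all class and field names below are ours, not the patent's:

```python
from dataclasses import dataclass, field

@dataclass
class WorldObject:
    """Base data object: carries both a physical and a logical aspect."""
    name: str
    pose: tuple = (0.0, 0.0, 0.0)                   # physical aspect: position in the cell
    properties: dict = field(default_factory=dict)  # logical aspect: state and flags

@dataclass
class PhysicalObject(WorldObject):
    """A device such as a robot, gripper, or conveyor."""

@dataclass
class WorkObject(WorldObject):
    """A work piece that physical objects act upon."""

@dataclass
class EnvironmentQuality(WorldObject):
    """An environmental quality, e.g. a storage or supply area."""

@dataclass
class Relationship:
    """A functional relationship between a physical object and a work object."""
    subject: PhysicalObject
    target: WorkObject
    kind: str  # e.g. "grasps", "transports"

robot = PhysicalObject("robot-1")
piece = WorkObject("workpiece-7")
rel = Relationship(robot, piece, "grasps")
```

An autonomous processor would read these objects, reason over them, and write back property changes, e.g. `piece.properties["engraved"] = True`.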
  • The data objects may be arranged in at least one cyber-physical production unit (CPPU).
  • A CPPU includes a plurality of data objects representative of a tightly related group of physical objects, work objects, or functional relationships.
  • A first CPPU may include a second CPPU nested within the first CPPU.
  • A single data object may be included in more than one CPPU. To avoid conflicts, if the data object is being used by a first CPPU, it is disabled from being used by a second CPPU.
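The conflict-avoidance rule for data objects shared between CPPUs could be sketched as a simple ownership check. This is a hypothetical reading, not the patent's implementation; all names are illustrative:

```python
class DataObject:
    def __init__(self, name):
        self.name = name
        self.owner = None  # the CPPU currently using this object, if any

class CPPU:
    """Cyber-physical production unit: a tightly related group of data objects."""
    def __init__(self, name, objects=(), subunits=()):
        self.name = name
        self.objects = list(objects)
        self.subunits = list(subunits)  # CPPUs may be nested within other CPPUs

    def acquire(self, obj):
        """Claim a shared object; fails while another CPPU is using it."""
        if obj.owner is not None and obj.owner is not self:
            return False  # disabled for this unit while in use elsewhere
        obj.owner = self
        return True

    def release(self, obj):
        if obj.owner is self:
            obj.owner = None

robot = DataObject("robot-1")
unit_a = CPPU("engraver-tending", [robot])
unit_b = CPPU("packaging", [robot])
assert unit_a.acquire(robot)
assert not unit_b.acquire(robot)  # robot is in use by unit_a
unit_a.release(robot)
assert unit_b.acquire(robot)      # now available again
```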
  • The world model is designed based on the goals of the system, not on the manner in which those goals are achieved.
  • Skills may be defined as data objects representative of a skill to perform one of the goals of the system.
  • Each skill data object may be referenced by more than one other data object representative of a physical object of the system.
  • An attachment mechanism for attaching a first data object to a second data object may be provided, wherein the first data object may be searched via the second data object.
  • A first visualization model may be used for a first physical object of the system for a first algorithm of that physical object, and a second visualization model may be used for the same physical object for a second algorithm.
  • A system for performing autonomous engineering of an industrial system includes a plurality of data objects representing physical objects, a plurality of data objects representing relationships between one or more physical objects, and a plurality of data objects representing skills attributable to one or more physical objects, wherein each skill object defines a goal to be achieved by the industrial system, and wherein an autonomous task is engineered by combining one or more data objects of the physical objects, the relationships between the physical objects, and the skills attributable to the physical objects.
  • Each data object representing a skill may receive one or more data objects representing physical objects, and relationships between physical objects, as parameters of the skill object.
  • The system includes a control processor.
  • The control processor receives a plurality of data objects as inputs.
  • The control processor produces outputs that affect a behavior of at least one of the physical objects of the system.
  • The control processor may be operable in a programmable logic controller (PLC).
  • A data object representing a physical object may be configured to have a data object representing a skill attached to it, wherein the skill represents an objective that the physical object can perform.
  • The data object representing a physical object may be attached to a plurality of data objects representing a plurality of skills.
  • A data object representing a skill may be attached to a plurality of data objects representing a corresponding plurality of physical objects.
  • A data object representing the skill identifies an objective that each of the plurality of physical objects may perform.
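The notion of a skill object that names a goal and takes world objects as parameters can be sketched as follows. This is an illustrative assumption (the class, the dictionary representation of a work piece, and the function names are all ours):

```python
class Skill:
    """A skill names a goal; how the goal is achieved is left to the machine."""
    def __init__(self, goal, action):
        self.goal = goal      # human-readable objective, e.g. "put object at location"
        self.action = action  # callable implementing the transformation

    def apply(self, *world_objects):
        # World objects (robot, work piece, location, ...) parameterize the skill.
        return self.action(*world_objects)

def put_at(robot, piece, location):
    """Goal-oriented action: the piece ends up at the location."""
    piece["location"] = location
    return f"{robot} placed {piece['id']} at {location}"

place = Skill("put an object at some location", put_at)
piece = {"id": "workpiece-7", "location": "conveyor"}
msg = place.apply("robot-1", piece, "jig")
# After the skill completes, the goal state holds regardless of how the
# robot actually moved — the method of achieving the goal is optional.
```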
  • FIG. 1 is a block diagram of a machine tending system according to aspects of embodiments of the present disclosure.
  • FIG. 2 is a block diagram of an autonomous system using reusable skills according to aspects of embodiments of the present disclosure.
  • FIG. 3 is an illustration of a robot, identifying high-level goal-oriented language to specify an application function according to aspects of embodiments of the present disclosure.
  • FIG. 4 is a hierarchical diagram illustrating abstractions in application programming according to aspects of embodiments of the present disclosure.
  • FIG. 5 is an illustration of a machine tending use case according to aspects of embodiments of the present disclosure.
  • FIG. 6 is an illustration of a breakdown of the system shown in FIG. 5, indicating individual objects comprising the machine tending unit according to aspects of embodiments of the present disclosure.
  • FIG. 7 is an illustration of scope of an object, using two robot grippers as a unit to perform a task according to aspects of embodiments of the present disclosure.
  • FIG. 8a is a hierarchical representation of a unit with a strict relationship to its associated objects according to aspects of embodiments of the present disclosure.
  • FIG. 8b is a hierarchical representation of multiple units using different configurations of shared objects according to aspects of embodiments of the present disclosure.
  • FIG. 9 is an illustration of a simplified geometric visualization model for an object according to aspects of embodiments of the present disclosure.
  • FIG. 10 is an illustration of the use of a function block in a conveyor track system according to aspects of embodiments of the present disclosure.
  • FIG. 11 illustrates the use of a skill having a reference to another object according to aspects of embodiments of the present disclosure.
  • FIG. 12 illustrates the use of a stand-in for a world object for receiving a passed reference to a skill according to aspects of embodiments of the present disclosure.
  • FIG. 13 is an illustration of a self-reference to an object for applying a skill according to aspects of embodiments of the present disclosure.
  • FIG. 14 is an illustration of a machine tending application showing an excessive number of generic tag markers according to aspects of embodiments of the present disclosure.
  • FIG. 15 is an illustration of the machine tending application of FIG. 14, showing more compact markers that are representative of the semantic meaning of a skill according to aspects of embodiments of the present disclosure.
  • FIG. 16 is an illustration of different types of markers having different graphical representations according to aspects of embodiments of the present disclosure.
  • FIG. 17 is a block diagram of a computer system that may be used to implement certain aspects of embodiments of the present disclosure.
  • An autonomous system has three fundamental qualities. First, the devices and their controllers know more about their environment than traditional devices. Advanced sensors detect the presence of objects in a much more general fashion allowing devices to gain knowledge that is unprecedented in traditional production design. Second, production is managed using higher-level abstraction based on the product to be manufactured and less so about the exact method or even the machinery employed to perform the process operations. Third, the system should use its knowledge to recover from unexpected situations. The system should possess algorithms that allow the flow of control to handle interruptions and errors without requiring an engineer to anticipate all possible failure conditions.
  • Examples of some methods according to the present disclosure are applied to use cases to show their effectiveness.
  • One use case includes machine tending and packaging. During machine tending, a flexible device is used to automate another machine that would typically be tended by a human operator. For packaging, multiple items may be brought together for sorting into containers. The objects may then be wrapped or otherwise processed further. It will be shown that Engineering with Reusable Skills makes the job for the application engineer clear and efficient.
  • Skills are used throughout this disclosure.
  • The meaning of the term skill as used here may be distinguished from its use in other contexts, which previously assigned the term a wholly different meaning.
  • The skills are used to program machinery, but they are not intended to be used as a programming language per se.
  • Skills are to programming as computer aided drafting (CAD) is to technical drawing.
  • Skills form a higher-level abstraction or representation of what a machine can do and what the machine should be doing. In this way, skills are distinguished from behaviors, which generally describe functional semantics.
  • Skill - A skill defines the transformation of material into work products.
  • A skill is machine independent in the sense that the goal is to achieve the transformation; the method of achieving the goal is optional.
  • Behavior - A behavior is an activity that machines, work products, and other entities in the system perform.
  • A robot may move its joints, for example.
  • A behavior is machine specific and can be applied to a multitude of tasks.
  • The behaviors of the automation system are manifested in the world objects, such as the devices performing the actions and the products being acted upon. Thus, behaviors are not really objects in themselves.
  • The devices and products are objects, and the behaviors characterize what they do. In digital form, a behavior is represented by simulation of the device activity. In the physical world, the real device actually performs the behavior.
  • The objects representing skills are intended to be used for high-level actions.
  • A skill might be "put an object at some location".
  • Another skill might be "take two work products and join them together in a given configuration".
  • A skill is parameterized by using other objects in the world model, such as a robot or the work pieces to be assembled.
  • A skill and its parameters form an interface that makes it convenient to apply skills, individually or consecutively, to achieve the final product. This interface is amenable to search, simulation, and optimization.
  • The system may be directed in goal-specific terms. By supplying skills, tasks are described at a higher level than I/O values or device states.
  • The goal of the task may be represented by configurations and references to the world objects in the environment. How the actual work is performed need not be detailed, provided the machine can determine how to do it for itself.
  • FIG. 1 is a block diagram showing a machine tending use case according to aspects of an exemplary embodiment.
  • The desire is to enable flexible and efficient configuration and programming of a mobile manipulator 101 that will be used with a CNC machine 103 for machine tending 105.
  • The main objective of this use case is to demonstrate that a system can be created that enables an application engineer to efficiently configure the locations of storage, supply, and production areas as well as the involved machines and their required skills.
  • It should be easy to configure the mobile manipulator 101 to tend multiple machines, and the system as a whole should be able to manipulate and transport a variety of different materials 110 and products 120.
  • The machines should be derived making use of product lifecycle management (PLM) data (e.g., part models) for much of their content.
  • FIG. 2 is a block diagram of an exemplary architecture for an autonomous engineering system according to aspects of embodiments of the present disclosure.
  • An autonomous systems revolution (ASR) Controller 210 is shown separately from the machines 220, 221 because it holds the ASR Runtime 211.
  • The target machines 220, 221 could host the runtime, but for the purposes of this illustration the implementation will be limited to the system's multi-platform controllers.
  • The ASR Controller 210 could be any device capable of running the ASR Runtime 211 specifications, such as a Linux-based network router by way of example.
  • The runtime 211 contains application programming interfaces (APIs) for accessing the target machines 220, 221 that provide, for instance, TCP/IP or Digital I/O capability.
  • A custom robot operating system (ROS) engine 230 may be implemented as part of a separate control path to the target machines 220, 221. It should be expected that the same machines could also be controlled by the runtime 211 (though not necessarily at the same time).
  • The ROS engine 230 defines an interface for the discovery and execution of skills and interacts with software for skill planning 240. This automatic planner will eventually switch to using world model objects 213 for skills and the execution engine 212 for executing planned processes.
  • Other major components include the engineering tool 230, runtime execution of engineering applications, and the design of the programming model itself. The product of the engineering is the process objects 213. These process objects 213 are effectively the code that specifies both the world model and the logical semantics that govern the application's function. The process objects 213 are shown as being the same in both the engineering 230, where they are specified, and in the runtime 211, where they are executed.
  • Component definitions 231 that define world model objects for the targeted process (e.g. machine models, work products, and skills) form the basis of this approach. These definitions 231 are loaded by the engineering server 232 at engineering time and by the execution engine 212 in the runtime 211.
  • The component definitions 231 provide the true implementation that connects the logical behavior of the application to the real machines 220, 221 that perform that behavior. During engineering, the machine behavior may be replaced with simulation.
  • An application engineer uses an engineering user interface 233 to create, view, and edit engineered processes and can rely on the engineering server 232 to simulate the behavior of engineered processes.
  • Skill engineering in the industrial domain focuses on programming robotic devices. For instance, voice commands have been used to assemble simple robot behaviors by iteratively constructing tasks through a dialog between a human instructor and a robot.
  • Researchers distinguish between a skill sequencing phase and a teaching phase. Sequencing is performed using a touch-based GUI on a tablet in which the operator selects skills from a skills library to create a skill sequence that matches the operations in the task at hand.
  • The operator then parameterizes the generic skills in the skill sequence during the teaching phase.
  • Other research discusses kinesthetic teaching (piloting the robot arm to target locations by pushing the end-effector) and gesture-based teaching (based on a set of abstract manipulation skills with only a single parameter as input).
  • The programmer might want to pick up a set of objects 301 and sort them into bins 310.
  • The "program" for this application would consist of the robot 320, cameras 330, and any other active device, plus the passive objects like the work pieces 301, jigs 303, and containers 310. Further, one would want the activity to be specified just as generically as denoted in the figure: as close as possible to just "pick up these boxes 340" and "put them in this box" 350. The majority of the actual programming in this case would be determining the correct selection of objects to affect.
  • A traditional engineer would be a machine builder or an integrator and would be familiar with concepts like I/O and programmable logic controller (PLC) programming, and with how to program using basic logic such as Relay-Ladder-Logic (RLL).
  • Programming with autonomous machines enables a new class of Application Engineer who may not know how to program in low-level languages or build circuits, but who instead understands how to assemble a suite of machines to accomplish production tasks and knows what those machines can do, so as to orchestrate them to work together.
  • A new hierarchy of programming semantics is shown in FIG. 4. As shown on the right, the top-most coding elements are intended to express the application semantics according to what the application does for the production.
  • The "goals" of the production 401 are the operations and transformations that occur to the work pieces regardless of the device that carries them out. At the bottom are the devices themselves, which are necessarily hardware-dependent 403. From an application perspective, a device could be used for many purposes, and a goal could be achieved by a multitude of hardware.
  • The machine builder 431 has the role of building the machines and programming their behavior, for example through program code or libraries 434. Behavior is specified using instructions 432, just as one might program a machine traditionally. The machine building role is only modified in the sense that the devices and behaviors need to be made available to the application engineers 421 at the higher level. We also consider the system integrator 433, who would have the role of deciding on interfacing factors 435 between the machine components and the skills. This would reflect what kinds of behaviors a machine should have and how one would access and control them.
  • The application engineer 421 has the role of building the skills, which themselves can be constructed from internal skills 420. In principle, there is no difference between skills 410 and internal skills 420. It is simply the case that skills 410 may also be accessed externally, such as from the MES system 411. Accordingly, the application engineer 421 may build skills that are constructed from skills 410 or internal skills 420. For example, the application engineer 421 may access higher level skills pertaining to an autonomous line 412 or autonomous cell 413. Internal skills 420 are used to implement other skills 422 but are themselves too implementation-specific to be useful as a machine interface.
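The relationship between externally visible skills and internal skills can be sketched as plain function composition: internal steps are too implementation-specific to expose, while the composed skill forms the machine interface. This is a hypothetical reading of the hierarchy; the machine-tending scenario and all function names are illustrative:

```python
# Internal skills: implementation-specific steps, not exposed externally.
def open_lid(machine):
    machine["lid"] = "open"

def insert_piece(machine, piece):
    machine["holding"] = piece

def close_lid(machine):
    machine["lid"] = "closed"

# Externally accessible skill, built by composing internal skills.
# An MES system would call this, never the individual steps.
def tend_machine(machine, piece):
    open_lid(machine)
    insert_piece(machine, piece)
    close_lid(machine)

engraver = {"lid": "closed", "holding": None}
tend_machine(engraver, "workpiece-7")
```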
  • World Model Objects are the means for representation and interlocution between the functional elements of the application. They are also the essential means to define the behavior of the application.
  • The use of World Model Objects allows the system to autonomously execute one or more skills 423 to execute the behaviors of the system. We will also introduce more kinds of objects into this world model in upcoming sections.
  • An example use case is shown in FIG. 5.
  • A machine, in this case a laser engraving device 501, is being tended by a mobile robot 503.
  • The robot in this example picks up a work piece 505 and brings it to the laser engraver 501; the work piece 505 looks like a small gray box in the figure, held by the gripper 507.
  • The robot 503 takes the work piece 505 off of a conveyor (not shown), sets it into a jig 509 for transport, opens up the laser engraver's lid 511, inserts the work piece 505 into the laser engraver 501, and the process continues.
  • A breakdown of the physical world objects used in such a unit is shown in FIG. 6.
  • World objects are created to represent the physical as well as the semantic and functional aspects of the application. It is rare for a world object to be purely physical or logical.
  • A robot, for example the KMR 601, is both a physical object and one that moves around under programmatic control. This is further true of the robot transport KMP 603 and conveyor 607. This movement is likewise both physical and logical; the other parts of the unit's application may look at the current pose of the robot 601 in order to decide the execution of their own behavior. Even a simple object, like the work piece 505 that looks like a little box, is both physical and logical.
  • A work piece 505 is a logical element of the application; it is there so that the unit performs work, and as such it acts like a switch.
  • The work piece's size can dictate the size of the opening for the grasping device 507, and its angle and position determine how it can be moved.
  • Units may be nested in scope; for example, the laser engraver 501 may include the laser engraver's lid 511.
  • A handle 605 attached to the engraver lid 511 is part of the engraver lid 511 as well as the laser engraver machine 501.
  • A unit provides a set of services consistent with a tightly related group of machines.
  • A gripper attached to a robot is almost always considered part of the same unit.
  • Whether or not two robots are part of the same unit depends on the task. If the two robots work together on the same work product, then they might best be considered part of the same unit. Two robots working independently might be better as separate units.
  • A camera system sensing the location of work parts may be shared across the entire shop floor. As such, it might be considered a unit unto itself, or it may be considered a shared sub-unit across other processing units.
  • A robot 701 may be regularly used as an individual unit, but for some special task, such as carrying a large object, it may recruit a second robot 703 to help lift the heavy item.
  • The robot 701 running individually would form one unit when it is being applied to its usual tasks.
  • The second helper robot 703 would likewise be its own unit when working in that mode.
  • The two individual units may temporarily form a third, combined unit 705. While the combined task is in progress, the individual units 701, 703, though still present, would be inactive and have to wait until the combined unit 705 has completed before re-activating to start performing their individual tasks again.
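The temporary combined unit described above could be sketched as follows, with the individual units deactivated while the combined unit runs and reactivated when it dissolves. This is an illustrative sketch only; the class and function names are ours:

```python
class Unit:
    """A unit groups devices and exposes skills; it may be active or waiting."""
    def __init__(self, name, devices):
        self.name = name
        self.devices = set(devices)
        self.active = True

def form_combined_unit(name, units):
    """Merge units for a joint task; the members pause until it completes."""
    for u in units:
        u.active = False  # individual units wait while the combined task runs
    return Unit(name, set().union(*(u.devices for u in units)))

def dissolve(combined, units):
    """End the combined task; individual units resume their own work."""
    combined.active = False
    for u in units:
        u.active = True

u1 = Unit("robot-701", {"robot-701"})
u2 = Unit("robot-703", {"robot-703"})
carry = form_combined_unit("heavy-carry", [u1, u2])
assert not u1.active and not u2.active        # individuals are inactive
assert carry.devices == {"robot-701", "robot-703"}
dissolve(carry, [u1, u2])
```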
  • A unit is also not one-to-one with controllers.
  • A single unit may be implemented in a distributed fashion over several controllers, multiple units might be implemented within a single controller, and any other combination is possible.
  • A camera system may require its own processing unit to handle the computational load for recognizing objects.
  • Intelligent components like robots and conveyors will commonly use their own processors and controllers.
  • A unit is associated with the services it can contribute, as designated by the skills the unit exposes to the outside world. For example, one unit may provide a service that requires only one robot 701 to accomplish - like moving a workpiece from a conveyor to a holding bin. A separate unit may involve that same robot 701 and another robot 703 to perform a combined assembly skill such as the insertion of a work piece 707 into a container as shown in FIG. 7.
  • FIG. 8a and FIG. 8b are block diagrams showing the relationship and scope of one or more unit designations according to embodiments of this disclosure. If one enforces a strict relationship where a given device can only belong to one unit, then any skill that combines the resources of multiple devices must require those devices to be part of the same unit. This situation is shown in FIG. 8a.
  • A single unit 801 is provided that includes three skills 803, 805 and 807.
  • The skills 803, 805, 807 may be associated with one or more physical objects 809, including robots and work pieces.
  • Production systems are rarely created from a green field, and historically, devices might already have been separated across multiple units by a prior application engineer, as shown in FIG. 8b.
  • Units 810, 820 and 830 are defined separately and contain skills 811, 821 and 831, respectively. This should not prohibit a new configuration of devices 809 from being used when it is recognized that a useful skill can be produced by that combination. As such, it should be possible to seamlessly impose a new unit with new skills onto an existing set of devices without removing or re-writing the code of the original units. Having multiple units share devices is shown in FIG. 8b.
  • A model can only include so much detail before it becomes unreasonable to use for anything.
  • The purpose of the model needs to be understood and taken into account when designing world model objects to be used in engineering.
  • The first thing to recognize is that the purpose is to program production machines, not to simulate a machine or even to design a machine. What matters is the production itself and not the particular inner workings of the device. Such concerns can be left to other engineering tools.
  • PLM data consists of specialized formats generated by tools that are typically incompatible.
  • The PLM data for a machine may include its CAD geometry, its wiring diagram, and its PLC source code. A human examining the data may infer how the device works. For a machine, these are separate data sources with little overlap, and the majority of the device's function is implied but not stated.
  • 3D geometry an obvious kind of PLM data that might be scavenged is 3D geometry. Having the correct 3D shape for a device is useful for display, programming, and simulation of the device.
  • tools for creating 3D CAD exist, but frequently, machine builders use 2D CAD. The goal of the machine builder is to create blueprints for their device that they can hand off to machinists and parts suppliers that actually build it. For this task, 2D CAD is well understood and a time-honored element of many manufacturers' toolchains.
  • 3D CAD uses geometric primitives (polygons, a.k.a. triangles) to render even the tiniest details like air vents and screw holes. This level of detail causes a shape that could be rendered with a few hundred polygons to balloon into millions. Even assuming Moore's Law will somehow make graphics cards fast enough to render this much unnecessary complexity, the sheer inefficiency limits the size of problem that can be handled. Instead of drawing just one machine with millions of polygons, it is much more useful to be able to draw hundreds of machines with more efficiently chosen polycounts.
  • instead of modeling each hole geometrically, a picture of the holes may be drawn in the same place. This uses significantly fewer polygons and exploits the ability of graphics processors to draw image maps quickly.
  • the idea is to find a polynomial function to represent the polygons in the surface. This is a kind of curve fitting operation. Once a curve is found, the number of polygons can be adjusted by creating a new rendering for the curve. A circle that was originally drawn with 512 segments can be reduced to a circle with 12 segments. This process has problems with topology similar to those of mesh decimation.
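The circle-reduction step above can be sketched as a small curve-fitting routine. The function names below are illustrative assumptions, and the fit is a simple centroid-and-mean-radius estimate rather than a production mesh-processing algorithm:

```python
import math

def fit_circle(points):
    # Minimal circle fit: centroid as center, mean distance as radius.
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    r = sum(math.hypot(p[0] - cx, p[1] - cy) for p in points) / len(points)
    return cx, cy, r

def render_circle(cx, cy, r, segments):
    # Tessellate the fitted curve with a chosen polygon count.
    return [(cx + r * math.cos(2 * math.pi * i / segments),
             cy + r * math.sin(2 * math.pi * i / segments))
            for i in range(segments)]

dense = render_circle(0.0, 0.0, 5.0, 512)   # over-tessellated original
cx, cy, r = fit_circle(dense)               # recover the underlying curve
coarse = render_circle(cx, cy, r, 12)       # re-render with far fewer segments
assert len(coarse) == 12
```

Once the curve parameters are recovered, the polygon count becomes a free rendering choice rather than a property baked into the imported geometry.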
  • Labeling objects in the world object model may be addressed from more than one perspective.
  • attachment helps reduce the reliance on type systems to encode data. Instead of creating a new subclass of an object to encode new data, simply attach the new data to an object via an appropriate attachment.
  • Attachments form a basic kind of reference that can be easily searched. For example, one does not need a special form to consider a manipulator class that includes a robot and a gripper. The robot alone can stand in for the top-level object and the gripper becomes "attached" to the robot. This also helps reduce the number of helper classes and allows for partial configurations to arise naturally. If the particular configuration of objects does not exist, one does not make the attachment. It is perfectly feasible that a device contains a robot with no gripper or that one has a gripper but no robot.
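A minimal sketch of attachment-based referencing, assuming hypothetical `WorldObject` and `find_attached` names; the robot/gripper pairing mirrors the example above:

```python
class WorldObject:
    def __init__(self, name):
        self.name = name
        self.attachments = []

    def attach(self, other):
        self.attachments.append(other)

    def find_attached(self, predicate):
        # Attachments form a basic reference that can be searched.
        return [o for o in self.attachments if predicate(o)]

robot = WorldObject("robot")
gripper = WorldObject("gripper")
robot.attach(gripper)   # no special Manipulator class is needed

# Partial configurations arise naturally: a robot with no gripper is
# simply a robot with an empty attachment list.
bare_robot = WorldObject("robot2")

grippers = robot.find_attached(lambda o: "gripper" in o.name)
assert grippers == [gripper]
```

The robot stands in for the top-level object, and the gripper is found by search rather than by a dedicated helper class.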
  • Position/Orientation/Shape is used as a first-class property.
  • a physical representation is kept in the logical structure; typically, concepts of position and shape are boiled down into simpler data structures.
  • the advantage of representing positions explicitly is obvious - it's where the robot's manipulator goes.
  • the concepts of size, shape, position, and orientation can be used in more general and even subtle ways. For example, consider a representation of a drill hole. One might commonly state that the drill skill has parameters of depth and diameter - numerical entities. An alternative would be to create a cylinder object of the proper size and position for the hole and use the shape as the drilling skill parameter.
  • a cylinder shape is explicitly parameterized in a known manner.
  • graphical semantic markers may be used as tags. The natural consequence of providing geometry as a direct programmable concept is that tagging becomes more obviously graphical and geometric.
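The drill-hole example above can be sketched as follows. The `Cylinder` and `drill` names are hypothetical stand-ins for the engineering-system objects; the point is that the skill takes one shape object instead of loose depth and diameter numbers:

```python
from dataclasses import dataclass

@dataclass
class Cylinder:
    x: float
    y: float
    z: float          # position of the hole
    diameter: float
    depth: float

def drill(shape: Cylinder):
    # The skill reads everything it needs -- position and dimensions --
    # from the single shape parameter.
    return (f"drill {shape.diameter}mm hole, {shape.depth}mm deep "
            f"at ({shape.x}, {shape.y}, {shape.z})")

hole = Cylinder(x=10.0, y=20.0, z=0.0, diameter=6.0, depth=15.0)
assert "6.0mm" in drill(hole)
```

Because the parameter is a world object, it can also be displayed, moved, and tagged graphically like any other shape in the model.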
  • the engineering objects that represent skills are effectively the same kinds of objects used to represent all other dynamic features in the engineering system.
  • in the same way that one represents an actuator in a robot arm that moves in a circular direction, one can represent a service the robot performs, like picking up an object.
  • the context for a skill object is different from an actuator because it has a different role.
  • an actuator might be considered "part of" the robot, while a service like picking something up may be considered "attached to" the robot.
  • a behavior is what world objects do: a robot moves, a timer ticks, a work piece gets worked, etc.
  • a number of different ways to cause behavior to occur may be provided: property setting, method call, function block, and custom skills.
  • the diversity in methods reflects a desire to maintain a convenient interface for different kinds of functions and semantics.
  • a simple function with one parameter might be best reflected with a property.
  • a complicated function with many parameters and synchronous call semantics may best be implemented as a method call.
  • a gripper may have a "grasp" property whose value is a floating-point value in range 0.0 to 1.0. When set to 1.0, the gripper is fully open; when set to 0.0, the gripper is closed. Setting the value at run time changes the set point of the gripper. The gripper moves in response and when complete may toggle a stopped event using another property.
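The grasp-property behavior described above might be sketched like this. The `Gripper` class is a simplifying assumption, and the simulated motion completes immediately so that the stopped event can be shown toggling:

```python
class Gripper:
    def __init__(self):
        self._grasp = 1.0      # 1.0 = fully open, 0.0 = closed
        self.stopped = False   # event property toggled when motion completes

    @property
    def grasp(self):
        return self._grasp

    @grasp.setter
    def grasp(self, value):
        # Setting the property changes the set point; the (simulated)
        # motion finishes at once and toggles the stopped event.
        self._grasp = max(0.0, min(1.0, value))
        self.stopped = True

g = Gripper()
g.grasp = 0.0          # command: close the gripper
assert g.grasp == 0.0 and g.stopped
```

A real device would move asynchronously and toggle `stopped` only when the motion actually ends; the property interface is the same either way.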
  • Another means to implement a behavior is with an object method.
  • World objects are defined by classes.
  • the class implementer is normally the machine builder.
  • Methods are useful for activities that can be easily expressed as a procedure call.
  • An example might be commanding a robot to move by setting the positions of all axes to a set of values. This might be considered more convenient than setting individual robot axis properties since it is implicit that all axis values are set at the same time.
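A sketch of the method-call form, assuming a hypothetical `move_axes` method that applies all axis positions together rather than through per-axis properties:

```python
class Robot:
    def __init__(self, num_axes=6):
        self.axes = [0.0] * num_axes

    def move_axes(self, positions):
        # All axis set points are applied in one call, which makes the
        # "at the same time" semantics implicit.
        if len(positions) != len(self.axes):
            raise ValueError("expected one position per axis")
        self.axes = list(positions)

r = Robot()
r.move_axes([0.0, 45.0, -30.0, 0.0, 90.0, 0.0])
assert r.axes[1] == 45.0
```

Setting six individual axis properties would require extra synchronization to guarantee the values take effect together; the method call makes that atomicity part of the interface.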
  • a function block is an object-based technique to represent a function similar to what is done in PLC programming.
  • a function block has a state similar to an object, but it is also activated like a function.
  • a carrier 1001 in a smart conveyor track 1003 system is being controlled by a function block 1005 as its input.
  • the block 1005 appears as a circuit element 1007 whose input is tied to the device object, in this case the puck 1009.
  • Different kinds of function blocks would carry out different behaviors associated with the device.
  • a selection of three different functions 1011, 1013, 1015 are being applied to the puck 1009 as represented by the three rows of circuits being joined to form the input 1005 to the puck 1009.
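The function-block mechanism can be approximated in a few lines. The `FunctionBlock`, `MoveTo`, and `Puck` names are illustrative, not the actual implementation; the key property is that a block carries state like an object but is activated like a function, with its input tied to the device object:

```python
class FunctionBlock:
    """A block with persistent state that is activated like a function."""
    def __init__(self, device):
        self.device = device      # input tied to the device object (the puck)

    def activate(self):
        raise NotImplementedError

class MoveTo(FunctionBlock):
    def __init__(self, device, target):
        super().__init__(device)
        self.target = target      # state retained between activations

    def activate(self):
        self.device.position = self.target

class Puck:
    def __init__(self):
        self.position = 0.0

puck = Puck()
# Several blocks wired to the same device, like rows of circuits
# joined to form the puck's input.
blocks = [MoveTo(puck, 0.5), MoveTo(puck, 1.2)]
for b in blocks:
    b.activate()
assert puck.position == 1.2
```

Different block subclasses would carry out different behaviors associated with the device, as in PLC function block programming.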
  • a custom skill is just a way of noting that some kinds of functions are complex and ought to be coded using a lower-level language for efficiency. It is not that such a skill could not be specified as a combination of other skills at the application engineering level. It is just more convenient to hard-code the skill's behavior directly.
  • the engineering system supports this in the same way that the behavior of any world object is supported.
  • a skill is a kind of object in the engineering system just as a robot is a kind of object.
  • a skill can have behavior.
  • a simple skill can have a simple behavior, like setting the value of another object's property.
  • a complicated skill can have complicated behavior, like determining the best route in an intelligent conveyor system without causing collisions.
  • a composite skill is simply a skill that is formulated by assembling a set of other skills. From an implementation standpoint, all skills are composite in the sense that they cause behaviors of the various devices to come into play. Here, the discussion is more about the application engineer assembling new skills by combining the effects from other skills using engineering building blocks. The building blocks are based on common programming operations such as iteration, conditions, sequences, etc.
  • the semantics of a typical composite skill might be to execute a set of skills in sequence. This would be a sequence block or a sequence skill as you may be inclined to call it. The data encoded in the sequence is the ordered list of sub-skills that will be executed. The behavior of the sequence skill is to invoke each sub-skill in order one at a time. Other kinds of composition blocks, like if-then or loop, would have corresponding behavior implementations.
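The sequence-skill semantics described above might look like the following sketch, with hypothetical `SetSkill` and `SequenceSkill` classes; the data is the ordered sub-skill list and the behavior is invoking each sub-skill in turn:

```python
class Skill:
    def invoke(self):
        raise NotImplementedError

class SetSkill(Skill):
    """A simple skill: set the value of another object's property."""
    def __init__(self, obj, prop, value):
        self.obj, self.prop, self.value = obj, prop, value

    def invoke(self):
        setattr(self.obj, self.prop, self.value)

class SequenceSkill(Skill):
    """Composite skill: invoke each sub-skill in order, one at a time."""
    def __init__(self, sub_skills):
        self.sub_skills = list(sub_skills)

    def invoke(self):
        for s in self.sub_skills:
            s.invoke()

class Gripper:
    grip = 1.0

g = Gripper()
pick = SequenceSkill([SetSkill(g, "grip", 0.0),   # close
                      SetSkill(g, "grip", 1.0)])  # open
pick.invoke()
assert g.grip == 1.0
```

Other composition blocks, such as if-then or loop, would follow the same pattern with a different `invoke` implementation.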
  • the programming structural blocks are only a part of the needed information to form a program.
  • the other needed elements are data representation and formulaic expressions.
  • the data representation is handled in the world model at least at the global level. We can also establish local variables in skill block composition. What remains is the expression language.
  • the expression language handles the basic data manipulation functions like operators: addition, multiplication, modulo, etc., as well as structural climbing functions finding parents and parts of objects, searching for objects with specific properties, fetching properties from object, etc.
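A minimal sketch of the structural climbing functions, with assumed names (`find_parent`, `search`) that are not taken from the disclosure; ordinary operators come with the host language:

```python
class WorldObject:
    def __init__(self, name, parent=None, **props):
        self.name, self.parent, self.props = name, parent, dict(props)
        self.parts = []
        if parent:
            parent.parts.append(self)

def find_parent(obj):
    # Structural climbing: step up to the enclosing object.
    return obj.parent

def search(root, **wanted):
    # Search the part tree for objects whose properties match.
    hits = [root] if all(root.props.get(k) == v
                         for k, v in wanted.items()) else []
    for p in root.parts:
        hits.extend(search(p, **wanted))
    return hits

cell = WorldObject("cell")
robot = WorldObject("robot", parent=cell, kind="manipulator")
gripper = WorldObject("gripper", parent=robot, kind="end_effector")

assert find_parent(gripper) is robot
assert search(cell, kind="end_effector") == [gripper]
```

Fetching a property from a found object is then just attribute or dictionary access on the result.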
  • the typical approach to implement an expression language is to provide a parser.
  • a Simula-like textual language with infix operators is standard.
  • Another common approach is a graphical constraint language like Simulink where operators are joined in a graphical node and line diagram representing dataflow. Dataflow diagrams tend to be easy to write but hard to read and understand. Textual languages tend to be easier to read because they are more compact, but they are much harder to write and can still be difficult to understand.
  • the parameters for skills appear as references to objects in the world model.
  • a reference to a particular known object would be a constant global reference (the object itself is not constant, the reference is fixed). Any other reference would be either a derived reference such as taking an object and finding one that is attached or otherwise linked to it or it would be a reference that is passed as a parameter.
  • the skill blocks require a mechanism to pass references from its own parameters to its sub-skills.
  • FIG. 11 shows a simple case: the application engineer at some point wants to close a gripper 1101. The behavior of the gripper is provided by the "grip" property 1103; setting it to zero closes the gripper. The application engineer creates a Set skill 1105 for setting the gripper property. The Set skill 1105 points directly to the gripper 1107 and when it is invoked, the gripper 1101 closes.
  • FIG. 12 for a passed reference, one instead uses a symbolic stand-in.
  • the stand-in represents a kind of world object and can be included in the expressions that do the work of the skill.
  • the symbolic stand-in is transformed into a reference to an actual world object and the expressions apply to it for real.
  • the symbolic stand-in might be a reference to a local variable where the result of an expression is stored.
  • a Set skill 1201 uses a symbolic stand-in 1203 identifying which gripper to close.
  • the expression 1203 in this case is a parameter of the enclosing skill 1201.
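Passing a symbolic stand-in as a skill parameter might be sketched as follows. The `Param` class and the bindings dictionary are assumed mechanisms for resolving the stand-in to an actual world object at invocation time:

```python
class Param:
    """Symbolic stand-in for a world object, resolved at invocation."""
    def __init__(self, name):
        self.name = name

class SetSkill:
    def __init__(self, target, prop, value):
        self.target, self.prop, self.value = target, prop, value

    def invoke(self, bindings=None):
        target = self.target
        if isinstance(target, Param):          # resolve the stand-in
            target = (bindings or {})[target.name]
        setattr(target, self.prop, self.value)

class Gripper:
    grip = 1.0

close = SetSkill(Param("which_gripper"), "grip", 0.0)
g1, g2 = Gripper(), Gripper()
close.invoke({"which_gripper": g2})   # the same skill acts on any gripper
assert g2.grip == 0.0
```

Until invocation, the stand-in can appear freely in the expressions that do the work of the skill; binding it makes those expressions apply "for real".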
  • FIG. 13 is an illustration of a skill referencing itself.
  • the skill itself is a world object and can be used as a global reference to find other connections.
  • the skill might be attached to the robot that is intended to do the work.
  • the skill 1301 uses an expression that takes itself (like a "this" pointer) and reads its attached property to find the robot 1303.
  • the skill refers to itself as “self” and from there finds the gripper object 1305 to pass as a parameter to one of its sub-skills 1307.
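The self-reference pattern might be sketched like this, with hypothetical attachment bookkeeping; the skill reads its own attachment (like a "this" pointer) to find the robot, then climbs to the gripper:

```python
class Robot:
    def __init__(self):
        self.attachments = {}

class Gripper:
    grip = 1.0

class CloseGripperSkill:
    def __init__(self):
        self.attached_to = None   # the robot this skill hangs off

    def invoke(self):
        robot = self.attached_to                 # "self" as global reference
        gripper = robot.attachments["gripper"]   # climb to the gripper
        gripper.grip = 0.0                       # act as the Set sub-skill

robot, gripper = Robot(), Gripper()
robot.attachments["gripper"] = gripper

skill = CloseGripperSkill()
skill.attached_to = robot   # attach the skill to the robot doing the work
skill.invoke()
assert gripper.grip == 0.0
```

Because the skill is itself a world object, no extra parameter is needed to tell it which robot to operate on.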
  • the object grouping that constitutes a skill includes not only the code of the skill but also the set of all objects that are affected by the skill. If a skill sets a gripper's open position to 0.5 then the gripper is a member of the skill's set of objects.
  • a "Pick" skill is running on a first robot (Robot1) to which Gripper1 is attached. If a second robot is part of the same unit - say Robot2 with Gripper2 - then it should generally be okay to start a "Place" skill on Robot2 even though the Pick skill of Robot1 is not yet complete.
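The disjointness test implied above is straightforward to sketch; the object names mirror the example, and the check itself is simply set intersection on the skills' affected-object sets:

```python
def can_run_concurrently(skill_a_objects, skill_b_objects):
    # Two skills may run at the same time when the sets of objects
    # they affect do not overlap.
    return not (set(skill_a_objects) & set(skill_b_objects))

pick_objects = {"Robot1", "Gripper1"}    # objects affected by Pick
place_objects = {"Robot2", "Gripper2"}   # objects affected by Place
conflicting = {"Robot1"}                 # a skill that also needs Robot1

assert can_run_concurrently(pick_objects, place_objects)
assert not can_run_concurrently(pick_objects, conflicting)
```

A scheduler built on this rule can start Place on Robot2 while Pick is still running on Robot1, but must serialize any skill that touches an object already in use.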
  • FIG. 14 and FIG. 15 are illustrations of autonomous systems using a marking scheme for programming and control according to embodiments of the present disclosure.
  • a major drawback of using a marking scheme for programming is controlling how much detail is presented to the user.
  • a marked-up application is shown for controlling a mobile platform 1401 and robot 1403. This particular system has four different commands including packaging a set of work pieces, replenishing its stock of work pieces, and tending the laser cutter 1405.
  • the result becomes a messy clutter.
  • the markers 1510 are reduced to only those used to calibrate the robot's 1501 position.
  • the robot 1501, on first activation, drives over to the conveyor area 1503, represented by a long gray box composed of one or more one-meter tracks.
  • the robot 1501 then reaches over and touches the conveyor 1503 at three positions.
  • the positions shown with markers 1510 on the conveyor are the spots being touched, but there are also markers showing the robot's approach vector and other attributes.
  • the wireframe object 1505 is being used to store the results of the calibration.
  • the diagram is still not especially readable because the markers are still overly generic and overlapping. It needs to be possible to make the markings more elegant and better at representing the data being used.
  • the generic tag markers have been replaced with more compact markings 1601 that carry more of the semantic meaning of what the skill is intending to accomplish.
  • the compact markings 1601 are fully embedded in the 3D space and will move accordingly when the scene is rotated. While a user cannot fully comprehend the purpose of the markers 1601 just by looking at them, accompanying text, as in a diagram, helps to guide the explanation.
  • the dots represent positions where the robot gripper is sent to start a movement.
  • the square dot 1603 is the homing start position.
  • circle dots 1605 are the spots where the robot 1501 starts a touching pass.
  • the robot 1501 touches the conveyor 1503 three times starting from a circle dot 1605 and moving in the direction of the arrow.
  • the marker objects themselves can be made part of the skill itself.
  • a composite skill can contain sub-skills, for example, but it can also contain references to the markers used as parameters. In this way, a skill can also observe and set values in the markers to affect different activities.
  • a marker may be moved to follow the action in a camera, for example.
  • a calibration routine designed to cover four robot motions will have associated markings that describe where to carry out the motion.
  • the markers are actually members of the skill and they also are used as parameters to sub-skills such as performing the sweep motion to touch the conveyor. Unlike physical objects, the members of a skill object may be re-used in other skill objects.
  • the system is designed to enable an application engineer to efficiently configure a machine tending system. More specifically, the engineer configures the locations of storage, supply, and production areas and the required skills, and selects the involved machines (for example, a Kuka KMR iiwa mobile robot and a CNC cabinet in one case may be considered, a laser engraver may be considered in another).
  • this process is based on generic skills that are provided in the ecosystem and the integrator uses the Engineering UI to engineer and compose these skills.
  • the engineered process can then be simulated inside the Engineering UI and can be executed on the selected machines by invoking the appropriate device behaviors via the ASR Runtime - this is possible because the configured Process Objects are shared between the engineering environment and the runtime.
  • the mobile robot is instructed to locate the raw material, transport it to the laser cutter, and feed the laser cutter - this involves opening its door by using a handle.
  • the CNC machine is operated to produce the finished product.
  • the mobile robot retrieves the finished product from the machine and transports it to a storage location.
  • the entire execution process can be monitored by users using the Execution UI.
  • the application engineer reconfigures the shop floor using the Engineering UI, for instance by selecting a different CNC machine for further processing of the product, and the process starts anew.
  • the proposed system's ability to facilitate the engineering of industrial workflows that involve a gantry and computer vision system is demonstrated.
  • the system enables an application engineer to define what parts should be handled by the gantry and where they should be transported to, and to integrate the gantry with a computer vision system that locates the parts and obstacles to be avoided.
  • the application engineer uses the Engineering UI (FIG. 2) to engineer the desired process using generic skills.
  • the process can be simulated by the Engineering UI and can be executed on the actual hardware where the execution engine controls the machine agents that participate in the transport process.
  • the execution process can be monitored by users using the Execution UI.
  • An architecture is proposed that is based on the generation, modification, and sharing of a World Model that captures the devices, material, work products, locations, skills, etc. that are relevant for an industrial process.
  • an application engineer uses the described tools to create a collection of Process Objects that comprise an abstract functional specification of the engineered process for its simulation and execution on actual hardware.
  • the execution of an engineered process depends on Component Definitions that link the process specification to the actual implementation of machine Behavior (for simulation, this behavior is replaced with simulated behavior via simulation-specific component definitions).
  • An Execution Engine that runs on the ASR Runtime controls the execution of Process Objects, and the execution process and current system state are visualized using an Execution UI.
  • the described concepts provide a higher level of abstraction for application engineers to design industrial processes that depends on a library of reusable skills to simplify the programming and orchestration of industrial devices.
  • the main benefits of this approach are that it speeds up the (re)configuration of industrial setups and that solutions, once created, are generic enough to be deployed multiple times and on different types of hardware (e.g., pick-and-place processes are ubiquitous in industrial facilities).
  • the separation of the logical specification of an industrial process from its actual execution hardware will lead to libraries of these specifications that stay valid even as the hardware, its controllers, and their APIs change.
  • managing production systems on a higher abstraction level should enable them to recover from unexpected situations without requiring the engineer to anticipate all possible failure conditions.
  • FIG. 17 illustrates an exemplary computing environment 1700 within which embodiments of the invention may be implemented.
  • Computers and computing environments such as computer system 1710 and computing environment 1700, are known to those of skill in the art and thus are described briefly here.
  • the computer system 1710 may include a communication mechanism such as a system bus 1721 or other communication mechanism for communicating information within the computer system 1710.
  • the computer system 1710 further includes one or more processors 1720 coupled with the system bus 1721 for processing the information.
  • the processors 1720 may include one or more central processing units (CPUs), graphical processing units (GPUs), or any other processor known in the art. More generally, a processor as used herein is a device for executing machine-readable instructions stored on a computer readable medium, for performing tasks and may comprise any one or combination of, hardware and firmware. A processor may also comprise memory storing machine-readable instructions executable for performing tasks. A processor acts upon information by manipulating, analyzing, modifying, converting or transmitting information for use by an executable procedure or an information device, and/or by routing the information to an output device.
  • a processor may use or comprise the capabilities of a computer, controller or microprocessor, for example, and be conditioned using executable instructions to perform special purpose functions not performed by a general-purpose computer.
  • a processor may be coupled (electrically and/or as comprising executable components) with any other processor enabling interaction and/or communication there-between.
  • a user interface processor or generator is a known element comprising electronic circuitry or software or a combination of both for generating display images or portions thereof.
  • a user interface comprises one or more display images enabling user interaction with a processor or other device.
  • the computer system 1710 also includes a system memory 1730 coupled to the system bus 1721 for storing information and instructions to be executed by processors 1720.
  • the system memory 1730 may include computer readable storage media in the form of volatile and/or nonvolatile memory, such as read only memory (ROM) 1731 and/or random-access memory (RAM) 1732.
  • the RAM 1732 may include other dynamic storage device(s) (e.g., dynamic RAM, static RAM, and synchronous DRAM).
  • the ROM 1731 may include other static storage device(s) (e.g., programmable ROM, erasable PROM, and electrically erasable PROM).
  • system memory 1730 may be used for storing temporary variables or other intermediate information during the execution of instructions by the processors 1720.
  • a basic input/output system 1733 (BIOS) containing the basic routines that help to transfer information between elements within computer system 1710, such as during start-up, may be stored in the ROM 1731.
  • RAM 1732 may contain data and/or program modules that are immediately accessible to and/or presently being operated on by the processors 1720.
  • System memory 1730 may additionally include, for example, operating system 1734, application programs 1735, other program modules 1736 and program data 1737.
  • the computer system 1710 also includes a disk controller 1740 coupled to the system bus 1721 to control one or more storage devices for storing information and instructions, such as a magnetic hard disk 1741 and a removable media drive 1742 (e.g., floppy disk drive, compact disc drive, tape drive, and/or solid-state drive).
  • Storage devices may be added to the computer system 1710 using an appropriate device interface (e.g., a small computer system interface (SCSI), integrated device electronics (IDE), Universal Serial Bus (USB), or FireWire).
  • the computer system 1710 may also include a display controller 1765 coupled to the system bus 1721 to control a display or monitor 1766, such as a cathode ray tube (CRT) or liquid crystal display (LCD), for displaying information to a computer user.
  • the computer system includes an input interface 1760 and one or more input devices, such as a keyboard 1762 and a pointing device 1761, for interacting with a computer user and providing information to the processors 1720.
  • the pointing device 1761 for example, may be a mouse, a light pen, a trackball, or a pointing stick for communicating direction information and command selections to the processors 1720 and for controlling cursor movement on the display 1766.
  • the display 1766 may provide a touch screen interface which allows input to supplement or replace the communication of direction information and command selections by the pointing device 1761.
  • an augmented reality device 1767 that is wearable by a user, may provide input/output functionality allowing a user to interact with both a physical and virtual world.
  • the augmented reality device 1767 is in communication with the display controller 1765 and the user input interface 1760 allowing a user to interact with virtual items generated in the augmented reality device 1767 by the display controller 1765.
  • the user may also provide gestures that are detected by the augmented reality device 1767 and transmitted to the user input interface 1760 as input signals.
  • the computer system 1710 may perform a portion or all of the processing steps of embodiments of the invention in response to the processors 1720 executing one or more sequences of one or more instructions contained in a memory, such as the system memory 1730.
  • Such instructions may be read into the system memory 1730 from another computer readable medium, such as a magnetic hard disk 1741 or a removable media drive 1742.
  • the magnetic hard disk 1741 may contain one or more datastores and data files used by embodiments of the present invention. Datastore contents and data files may be encrypted to improve security.
  • the processors 1720 may also be employed in a multi-processing arrangement to execute the one or more sequences of instructions contained in system memory 1730.
  • hard-wired circuitry may be used in place of or in combination with software instructions. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
  • the computer system 1710 may include at least one computer readable medium or memory for holding instructions programmed according to embodiments of the invention and for containing data structures, tables, records, or other data described herein.
  • the term "computer readable medium” as used herein refers to any medium that participates in providing instructions to the processors 1720 for execution.
  • a computer readable medium may take many forms including, but not limited to, non-transitory, non-volatile media, volatile media, and transmission media.
  • Non-limiting examples of non-volatile media include optical disks, solid state drives, magnetic disks, and magneto-optical disks, such as magnetic hard disk 1741 or removable media drive 1742.
  • Non-limiting examples of volatile media include dynamic memory, such as system memory 1730.
  • Non-limiting examples of transmission media include coaxial cables, copper wire, and fiber optics, including the wires that make up the system bus 1721.
  • Transmission media may also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
  • the computing environment 1700 may further include the computer system 1710 operating in a networked environment using logical connections to one or more remote computers, such as remote computing device 1780.
  • Remote computing device 1780 may be a personal computer (laptop or desktop), a mobile device, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer system 1710.
  • computer system 1710 may include modem 1772 for establishing communications over a network 1771 , such as the Internet. Modem 1772 may be connected to system bus 1721 via user network interface 1770, or via another appropriate mechanism.
  • Network 1771 may be any network or system generally known in the art, including the Internet, an intranet, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a direct connection or series of connections, a cellular telephone network, or any other network or medium capable of facilitating communication between computer system 1710 and other computers (e.g., remote computing device 1780).
  • the network 1771 may be wired, wireless or a combination thereof. Wired connections may be implemented using Ethernet, Universal Serial Bus (USB), RJ-6, or any other wired connection generally known in the art.
  • Wireless connections may be implemented using Wi-Fi, WiMAX, Bluetooth, infrared, cellular networks, satellite, or any other wireless connection methodology generally known in the art. Additionally, several networks may work alone or in communication with each other to facilitate communication in the network 1771.
  • An executable application comprises code or machine-readable instructions for conditioning the processor to implement predetermined functions, such as those of an operating system, a context data acquisition system or other information processing system, for example, in response to user command or input.
  • An executable procedure is a segment of code or machine-readable instruction, sub-routine, or other distinct section of code or portion of an executable application for performing one or more particular processes. These processes may include receiving input data and/or parameters, performing operations on received input data and/or performing functions in response to received input parameters, and providing resulting output data and/or parameters.
  • a graphical user interface comprises one or more display images, generated by a display processor and enabling user interaction with a processor or other device and associated data acquisition and processing functions.
  • the GUI also includes an executable procedure or executable application.
  • the executable procedure or executable application conditions the display processor to generate signals representing the GUI display images. These signals are supplied to a display device which displays the image for viewing by the user.
  • the processor under control of an executable procedure or executable application, manipulates the GUI display images in response to signals received from the input devices. In this way, the user may interact with the display image using the input devices, enabling user interaction with the processor or other device.
  • the functions and process steps herein may be performed automatically or wholly or partially in response to user command.
  • An activity (including a step) performed automatically is performed in response to one or more executable instructions or device operation without user direct initiation of the activity.

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Stored Programmes (AREA)

Abstract

Disclosed is an autonomously operating system comprising a world model of the system. The world model comprises a plurality of data objects representative of physical objects in the system, a plurality of data objects representative of environmental qualities of the system, a plurality of data objects representative of work objects that are acted upon by the physical objects, and a plurality of data objects representative of functional relationships between the physical objects and the objects upon which the physical objects act. The system may further comprise an autonomous processor configured to receive information from the plurality of data objects representative of physical objects, of environmental qualities, of work objects, and of functional relationships, and to perform reasoning operations to select a desired behavior of the system. The autonomous processor modifies at least one property of at least one data object in response to the selected desired behavior.
PCT/US2018/024254 2017-03-24 2018-03-26 System and method for designing autonomous systems WO2018176025A1 (fr)
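The abstract describes a world model built from data objects (physical objects, environmental qualities, work objects, functional relationships) together with an autonomous processor that reasons over those objects and modifies their properties. A minimal Python sketch of that structure follows; all class names, the `kind` strings, and the trivial reasoning step are illustrative assumptions for demonstration, not taken from the patent text:

```python
from dataclasses import dataclass, field

@dataclass
class DataObject:
    """One data object in the world model: a physical object, an
    environmental quality, a work object, or a functional relationship."""
    name: str
    kind: str  # e.g. "physical" | "environment" | "work" | "relation"
    properties: dict = field(default_factory=dict)


@dataclass
class WorldModel:
    """The world model: a collection of data objects describing the system."""
    objects: list = field(default_factory=list)

    def by_kind(self, kind: str) -> list:
        return [obj for obj in self.objects if obj.kind == kind]


class AutonomousProcessor:
    """Receives information from the world model's data objects, performs a
    reasoning operation to select a desired behavior, and modifies at least
    one property of at least one data object in response."""

    def __init__(self, world: WorldModel):
        self.world = world

    def step(self):
        # Trivial stand-in for the reasoning operations: select the first
        # work object that has not yet been acted upon.
        for work in self.world.by_kind("work"):
            if not work.properties.get("processed", False):
                # The selected behavior modifies a data-object property.
                work.properties["processed"] = True
                return ("process", work)
        return ("idle", None)
```

Populating the model with, say, a robot (physical object), an ambient temperature (environmental quality), and a part to be machined (work object), then calling `step()`, selects the "process" behavior for the part and marks its `processed` property.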

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762476048P 2017-03-24 2017-03-24
US62/476,048 2017-03-24

Publications (1)

Publication Number Publication Date
WO2018176025A1 true WO2018176025A1 (fr) 2018-09-27

Family

ID=62063161

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/024254 WO2018176025A1 (fr) 2017-03-24 2018-03-26 System and method for designing autonomous systems

Country Status (1)

Country Link
WO (1) WO2018176025A1 (fr)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020106706A1 * 2018-11-19 2020-05-28 Siemens Aktiengesellschaft Object marking to support tasks performed by autonomous machines
EP3760392A1 * 2019-07-03 2021-01-06 Günther Battenberg Method and system for testing and/or assembling an object by means of a robot
WO2021129922A1 * 2019-12-23 2021-07-01 Siemens Aktiengesellschaft Method for operating an automation system and automation system
US11167420B2 (en) * 2018-02-06 2021-11-09 Tata Consultancy Services Limited Systems and methods for auto-generating a control and monitoring solution for smart and robotics environments
WO2021230876A1 * 2020-05-15 2021-11-18 Siemens Aktiengesellschaft Skill logic controller (SLC)
US20220048191A1 (en) * 2020-08-12 2022-02-17 General Electric Company Robotic activity decomposition
CN115335195A * 2020-03-27 2022-11-11 ABB Schweiz AG Method and system for programming a robot
WO2023028881A1 2021-08-31 2023-03-09 Siemens Aktiengesellschaft System, method and storage medium for automatic control of a production system
US12056476B2 (en) * 2019-11-12 2024-08-06 Bright Machines, Inc. Software defined manufacturing/assembly system
US12204314B2 (en) 2020-11-10 2025-01-21 Bright Machines, Inc. Method and apparatus for improved auto-calibration of a robotic cell
US12380587B2 (en) 2021-07-16 2025-08-05 Bright Machines, Inc. Method and apparatus for vision-based tool localization
WO2025183987A1 2024-02-29 2025-09-04 Siemens Corporation Systems and methods for interactive constraint resolution for robotic placement using semantic markers
US12420408B1 (en) * 2020-07-17 2025-09-23 Bright Machines, Inc. Human machine interface recipe building system for a robotic manufacturing system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150239127A1 (en) * 2014-02-25 2015-08-27 Gm Global Technology Operations Llc. Visual debugging of robotic tasks
WO2016074730A1 * 2014-11-13 2016-05-19 Siemens Aktiengesellschaft Planning method for producing a product and production module having self-description information

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150239127A1 (en) * 2014-02-25 2015-08-27 Gm Global Technology Operations Llc. Visual debugging of robotic tasks
WO2016074730A1 * 2014-11-13 2016-05-19 Siemens Aktiengesellschaft Planning method for producing a product and production module having self-description information

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
BOADA M J L ET AL: "Visual approach skill for a mobile robot using learning and fusion of simple skills", ROBOTICS AND AUTONOMOUS SYSTEMS, ELSEVIER SCIENCE PUBLISHERS, AMSTERDAM, NL, vol. 38, no. 3-4, 31 March 2002 (2002-03-31), pages 157 - 170, XP004344527, ISSN: 0921-8890, DOI: 10.1016/S0921-8890(02)00165-3 *
DURR M ET AL: "USING CONVENTIONAL AND NESTED RELATIONAL DATABASE SYSTEMS FOR MODELLING CIM DATA", COMPUTER AIDED DESIGN, ELSEVIER PUBLISHERS BV., BARKING, GB, vol. 21, no. 6, 1 July 1989 (1989-07-01), pages 379 - 392, XP000095159, ISSN: 0010-4485, DOI: 10.1016/0010-4485(89)90005-5 *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11167420B2 (en) * 2018-02-06 2021-11-09 Tata Consultancy Services Limited Systems and methods for auto-generating a control and monitoring solution for smart and robotics environments
US11951631B2 (en) 2018-11-19 2024-04-09 Siemens Aktiengesellschaft Object marking to support tasks by autonomous machines
CN113039499A * 2018-11-19 2021-06-25 Siemens Aktiengesellschaft Object marking to support tasks by autonomous machines
WO2020106706A1 * 2018-11-19 2020-05-28 Siemens Aktiengesellschaft Object marking to support tasks performed by autonomous machines
CN113039499B * 2018-11-19 2024-06-18 Siemens Aktiengesellschaft Object marking to support tasks by autonomous machines
EP3760392A1 * 2019-07-03 2021-01-06 Günther Battenberg Method and system for testing and/or assembling an object by means of a robot
US12056476B2 (en) * 2019-11-12 2024-08-06 Bright Machines, Inc. Software defined manufacturing/assembly system
WO2021129922A1 * 2019-12-23 2021-07-01 Siemens Aktiengesellschaft Method for operating an automation system and automation system
CN115335195A * 2020-03-27 2022-11-11 ABB Schweiz AG Method and system for programming a robot
WO2021230876A1 * 2020-05-15 2021-11-18 Siemens Aktiengesellschaft Skill logic controller (SLC)
US12420408B1 (en) * 2020-07-17 2025-09-23 Bright Machines, Inc. Human machine interface recipe building system for a robotic manufacturing system
US11654566B2 (en) * 2020-08-12 2023-05-23 General Electric Company Robotic activity decomposition
US20220048191A1 (en) * 2020-08-12 2022-02-17 General Electric Company Robotic activity decomposition
US12204314B2 (en) 2020-11-10 2025-01-21 Bright Machines, Inc. Method and apparatus for improved auto-calibration of a robotic cell
US12380587B2 (en) 2021-07-16 2025-08-05 Bright Machines, Inc. Method and apparatus for vision-based tool localization
WO2023028881A1 2021-08-31 2023-03-09 Siemens Aktiengesellschaft System, method and storage medium for automatic control of a production system
EP4367564A4 * 2021-08-31 2024-12-11 Siemens Aktiengesellschaft System, method and storage medium for automatic control of a production system
WO2025183987A1 2024-02-29 2025-09-04 Siemens Corporation Systems and methods for interactive constraint resolution for robotic placement using semantic markers

Similar Documents

Publication Publication Date Title
US11951631B2 (en) Object marking to support tasks by autonomous machines
WO2018176025A1 (fr) System and method for designing autonomous systems
Guerra-Zubiaga et al. An approach to develop a digital twin for industry 4.0 systems: manufacturing automation case studies
US10807237B2 (en) System and method for flexible human-machine collaboration
Kokkas et al. An Augmented Reality approach to factory layout design embedding operation simulation
CN112783018A (zh) 工业环境模拟下的机器人数字孪生控制
US12011835B2 (en) Engineering autonomous systems with reusable skills
Merdan et al. Knowledge-based cyber-physical systems for assembly automation
KR20230111250A (ko) 로봇 제어 계획들의 생성
Niermann et al. Software framework concept with visual programming and digital twin for intuitive process creation with multiple robotic systems
Park et al. Hardware-in-the-loop simulation for a production system
Mohammed et al. Leveraging model based definition and STEP AP242 in task specification for robotic assembly
Stenmark et al. From high-level task descriptions to executable robot code
US12115670B2 (en) Equipment specific motion plan generation for robotic skill adaptation
Lindorfer et al. Towards user-oriented programming of skill-based automation systems using a domain-specific meta-modeling approach
De Silva et al. Synthesising process controllers from formal models of transformable assembly systems
EP3974928B1 (fr) Gestionnaire de schéma de câblage et émulateur
Ding et al. Intuitive Instruction of Robot Systems: Semantic Integration of Standardized Skill Interfaces
Makris An Approach for Validating the Behavior of Autonomous Robots in a Virtual Environment
Khalil et al. Isaac Sim Integrated Digital Twin for Feasibility Checks in Skill-Based Engineering
Gross Computer Vision Based Object Manipulation in Gloveboxes Using a Robot Agnostic Controller and Digital Twin
Malkarov et al. Investigation of a Simulated Robot Model when it is Trained to Set Trajectories in the Task of Controlling Assembly Parameters at Remote Visualization
Kłodkowski et al. Simulating human motion using Motion Model Units–example implementation and usage
Sander Digital Twins for Flexible Manufacturing
Yadgarova Hybrid cloud environment for manufacturing control system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18720423

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18720423

Country of ref document: EP

Kind code of ref document: A1