CN110955421A - Method, system, electronic device, storage medium for robot programming - Google Patents
- Publication number: CN110955421A (application CN201911162546.6A)
- Authority: CN (China)
- Prior art keywords: action, sequence, action block, blocks, user
- Legal status: Pending (an assumption, not a legal conclusion; Google has not performed a legal analysis)
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/30—Creation or generation of source code
- G06F8/34—Graphical or visual programming
Abstract
The invention provides a method, a system, an electronic device, and a storage medium for robot programming. The method comprises: providing a robot programming interface that includes at least a first sequence editing region; and receiving a user's operations in the first sequence editing region to generate and display a first action block sequence. The first action block sequence comprises a plurality of action blocks arranged along a time axis; each action block's action type and action parameters are determined by the user's operations, and each action type maps to a corresponding program code segment. The first action block sequence is used by the robot to execute, in time-axis order, the code segment and action parameters mapped by each block's action type. The method and system provide an intuitive, simple, and clear interaction mode for robot programming.
Description
Technical Field
The present invention relates to the field of computer application technologies, and in particular, to a method, a system, an electronic device, and a storage medium for robot programming.
Background
With the development of robotics, robots have gradually spread from industrial settings toward personal and home use. However, programming a robot today usually requires a programming language, which is difficult for both personal and home users. Even in industrial settings, programming still requires personnel familiar with the robot's programming language, which greatly increases an enterprise's labor costs.
Therefore, how to let a user complete the programming of an entire robot through an intuitive, simple, and clear interaction mode, without having to understand the robot's obscure and difficult technical terms and programming languages, is a technical problem to be solved in this field.
Disclosure of Invention
The present invention is directed to overcoming the above problems of the related art, and to providing a method, system, electronic device, and storage medium for robot programming that substantially obviate one or more of the limitations and disadvantages of the related art.
According to one aspect of the invention, there is provided a method for robot programming, comprising:
providing a robotic programming interface, the robotic programming interface including at least a first sequence editing region; and
receiving a user's operations in the first sequence editing region to generate and display a first action block sequence, wherein the first action block sequence comprises a plurality of action blocks arranged along a time axis, each action block's action type and action parameters are determined by the user's operations, each action type maps to a corresponding program code segment, and the first action block sequence is used by the robot to execute, in time-axis order, the code segment and action parameters mapped by each block's action type.
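The core mechanism above (each action type maps to a code segment; the robot runs the blocks' code segments in time-axis order) can be sketched as follows. This is a minimal illustration in Python; the registry, type names, and parameters are assumptions for illustration, not the patent's actual implementation.

```python
from dataclasses import dataclass, field

# Hypothetical registry mapping each action type to a program code
# segment, modeled here as a callable that records what it would do.
CODE_SEGMENTS = {
    "position": lambda params, log: log.append(f"move_to {params['target']}"),
    "wait":     lambda params, log: log.append(f"wait {params['seconds']}s"),
    "gripper":  lambda params, log: log.append(f"gripper {params['state']}"),
}

@dataclass
class ActionBlock:
    action_type: str                             # chosen by the user from the type list
    params: dict = field(default_factory=dict)   # action parameters set by the user

def run_sequence(blocks):
    """Execute each block's mapped code segment in time-axis order."""
    log = []
    for block in blocks:
        CODE_SEGMENTS[block.action_type](block.params, log)
    return log

sequence = [
    ActionBlock("position", {"target": (0.3, 0.1, 0.2)}),
    ActionBlock("wait", {"seconds": 1}),
    ActionBlock("gripper", {"state": "close"}),
]
print(run_sequence(sequence))
```

The sketch shows why the user never touches code: editing a block only changes its `action_type` and `params`, while the code segments stay fixed in the registry.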
In some embodiments of the invention, the robotic programming interface further comprises a second sequence editing area, the providing the robotic programming interface further comprising:
receiving the user's operations in the second sequence editing region to generate and display a second action block sequence, wherein the second action block sequence comprises a plurality of action blocks arranged along the time axis, each action block's action type and action parameters are determined by the user's operations, each action type maps to a corresponding program code segment, and the second action block sequence is used by the peripheral device to execute, in time-axis order, the code segment and action parameters mapped by each block's action type,
wherein the first sequence of action blocks interacts with the second sequence of action blocks.
In some embodiments of the invention, the first sequence of action blocks interacting with the second sequence of action blocks comprises:
exchanging action parameters between action blocks of the first action block sequence and the second action block sequence, so that at least some action blocks in the first sequence are triggered by at least some action blocks in the second sequence, or at least some action blocks in the second sequence are triggered by at least some action blocks in the first sequence.
In some embodiments of the invention, said providing a robot programming interface further comprises:
receiving the user's packaging operation on at least one action block in the first or second action block sequence, so as to package the selected action blocks into an action package, wherein the action blocks in the package are operated on by the user as a unit.
In some embodiments of the present invention, the action package has two display modes, collapsed and expanded; when collapsed, the package has the same display size as an action block, but a different display color and/or brightness.
In some embodiments of the present invention, receiving the user's packaging operation on at least one action block in the first or second action block sequence to package the selected action blocks into an action package further includes:
receiving the user's settings for the execution parameters of the action package, wherein the execution parameters include a loop count for cyclic execution.
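The loop-count execution parameter can be sketched as follows: an action package holds its blocks and repeats them the configured number of times. The class and field names are illustrative assumptions, not the patent's implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ActionPackage:
    blocks: List[str] = field(default_factory=list)  # packaged blocks (names only, for brevity)
    loop_count: int = 1                              # cyclic-execution parameter set by the user

    def execute(self):
        """Run the inner blocks in order, loop_count times."""
        trace = []
        for _ in range(self.loop_count):
            trace.extend(self.blocks)
        return trace

pkg = ActionPackage(blocks=["move_A", "close_gripper", "move_B"], loop_count=2)
print(pkg.execute())
```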
In some embodiments of the invention, said providing a robot programming interface further comprises:
receiving the user's dragging of action blocks in the first and/or second sequence editing region, so as to change an action block's position on the time axis or move it to another sequence editing region.
In some embodiments of the invention, said providing the robot programming interface comprises:
receiving an adding operation of the user in the first and/or second sequence editing region, so as to add an action block in that region;
receiving the user's operation on the added action block, and displaying an action type list on the added action block; and
receiving the user's operation on the action type list, and determining the action type of the added action block.
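The add-then-choose flow above can be sketched as follows: a new block starts with a default type, then the user's selection from the type list replaces it. The type names and default are assumptions for illustration.

```python
# Assumed action-type list, following the types named in this disclosure.
ACTION_TYPES = ["position", "semaphore", "digital_io", "analog_io",
                "message_prompt", "wait", "load_config", "gripper",
                "user_coordinate_system"]
DEFAULT_TYPE = "position"   # assumed default for a freshly added block

def add_action_block(sequence):
    """Append a default action block to the editing region's sequence."""
    block = {"action_type": DEFAULT_TYPE, "params": {}}
    sequence.append(block)
    return block

def choose_action_type(block, selection):
    """Apply the user's pick from the displayed action type list."""
    if selection not in ACTION_TYPES:
        raise ValueError(f"unknown action type: {selection}")
    block["action_type"] = selection
    return block

seq = []
block = add_action_block(seq)     # the "+" operation
choose_action_type(block, "wait") # the user's pick from the type list
print(seq)
```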
In some embodiments of the invention, the action types include at least: position, semaphore, digital IO, analog IO, message prompt, wait, load configuration, gripper action, and user coordinate system.
In some embodiments of the invention, said providing a robot programming interface further comprises:
receiving the user's editing operation on an action block, wherein the action block is then displayed in an editing state, and a plurality of edit items are displayed at the edge of the action block's display area;
receiving the user's operation on an edit item, and displaying an edit page for that item on one side of the item; and
receiving the user's operation on the edit page, and editing the action parameters of the action block.
In some embodiments of the present invention, when the action type is position, the edit items include at least speed and acceleration time. The edit page displays a coordinate axis whose abscissa is acceleration time and whose ordinate is speed, and receiving the user's operation on the edit item includes: receiving the user's operations on the icons on the abscissa and ordinate of the coordinate axis, so as to edit the speed and the acceleration time.
According to yet another aspect of the invention, there is also provided an apparatus for robot programming, comprising:
an interface providing module for providing a robot programming interface, wherein the robot programming interface at least comprises a first sequence editing area; and
a receiving module for receiving a user's operations in the first sequence editing region to generate and display a first action block sequence, wherein the first action block sequence comprises a plurality of action blocks arranged along a time axis, each action block's action type and action parameters are determined by the user's operations, each action type maps to a corresponding program code segment, and the first action block sequence is used by the robot to execute, in time-axis order, the code segment and action parameters mapped by each block's action type.
According to yet another aspect of the invention, there is also provided a system for robot programming, comprising:
means for robotic programming as described above; and
a robot.
According to still another aspect of the present invention, there is also provided an electronic apparatus, including: a processor; a storage medium having stored thereon a computer program which, when executed by the processor, performs the steps as described above; the electronic device further comprises one or more of the following: an Ethernet network card; a wireless network card; an IO controller; a communication circuit.
According to yet another aspect of the present invention, there is also provided a storage medium having stored thereon a computer program which, when executed by a processor, performs the steps as described above.
Compared with the prior art, the invention has the advantages that:
by using action blocks and action block sequences ordered along a time axis, a user can complete the programming of an entire robot through an intuitive, simple, and clear visual and operational interaction mode, without needing to understand the robot's obscure technical terms and programming languages, truly achieving zero-code programming. This, on the one hand, makes robot programming accessible to personal and home users, and on the other hand, reduces enterprise labor costs.
Drawings
The above and other features and advantages of the present invention will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings.
Fig. 1 shows a flow chart of a method for robot programming according to an embodiment of the invention.
FIG. 2 illustrates a schematic diagram of a robot programming interface in accordance with a particular embodiment of the present invention.
FIG. 3 is a diagram illustrating an add action block according to a specific embodiment of the present invention.
Fig. 4 is a diagram illustrating an action block of the position type in a non-editing state, according to an embodiment of the present invention.
Fig. 5 is a diagram illustrating an action block of the position type in an editing state, according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating an expanded edit item in the editing state of a position-type action block, according to an embodiment of the invention.
FIG. 7 is a diagram illustrating another expanded edit item in the editing state of a position-type action block, according to an embodiment of the invention.
FIG. 8 shows a schematic diagram of an action package in an expanded state, according to a specific embodiment of the invention.
Fig. 9 shows a schematic diagram of an action package in a collapsed state, according to a specific embodiment of the present invention.
Fig. 10 shows a schematic diagram of a first sequence of action blocks and a second sequence of action blocks according to a specific embodiment of the invention.
FIG. 11 illustrates a schematic diagram of a fine tuning interface in accordance with a specific embodiment of the present invention.
FIG. 12 illustrates a block diagram of an apparatus for robot programming in accordance with an embodiment of the present invention.
FIG. 13 illustrates a block diagram of a system for robotic programming in accordance with a particular embodiment of the present invention.
Fig. 14 schematically illustrates a computer-readable storage medium in an exemplary embodiment of the invention.
Fig. 15 schematically illustrates an electronic device in an exemplary embodiment of the invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the invention and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the steps. For example, some steps may be decomposed, and some steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
Fig. 1 shows a flow chart of a method for robot programming according to an embodiment of the invention. With reference to fig. 1, the method for robot programming comprises the following steps:
step S110: providing a robotic programming interface, the robotic programming interface including at least a first sequence editing region; and
step S120: receiving a user's operations in the first sequence editing region to generate and display a first action block sequence, wherein the first action block sequence comprises a plurality of action blocks arranged along a time axis, each action block's action type and action parameters are determined by the user's operations, each action type maps to a corresponding program code segment, and the first action block sequence is used by the robot to execute, in time-axis order, the code segment and action parameters mapped by each block's action type.
Therefore, in the method for robot programming provided by the invention, by using action blocks and action block sequences ordered along a time axis, a user can complete the programming of an entire robot through an intuitive, simple, and clear visual and operational interaction mode, without needing to understand the robot's obscure technical terms and programming languages. On the one hand, this makes robot programming accessible to personal and home users; on the other hand, it reduces enterprise labor costs.
A method for robot programming according to an embodiment of the invention is described below with reference to fig. 2 to 10.
Referring first to fig. 2, fig. 2 shows a schematic view of a robot programming interface according to a specific embodiment of the invention.
In the embodiment shown in FIG. 2, the robot programming interface 200 includes a first sequence editing region 210 and a second sequence editing region 220. Each sequence editing region is used to edit a different action block sequence. Specifically, the first action block sequence edited in the first sequence editing region 210 is for execution by the robot, and the second action block sequence edited in the second sequence editing region 220 is for execution by the peripheral device. FIG. 2 is only a schematic diagram of one robot programming interface 200 provided by the present invention; the invention is not limited thereto. A third sequence editing region, a fourth sequence editing region, and so on may also be provided, for editing action block sequences executed by devices other than the robot, and such variations fall within the scope of the present invention.
In some embodiments of the present invention, step S110 in fig. 1 may be followed by the following steps: an adding operation of the user in the first sequence editing area 210 and/or the second sequence editing area 220 is received, so that an action block is added in the first sequence editing area 210 and/or the second sequence editing area 220.
Specifically, an action block is added to the first sequence editing region 210 by, for example, clicking, touching, long-pressing, or pressure-pressing the "+" icon in the first sequence editing region 210 of FIG. 2. Similarly, an action block is added to the second sequence editing region 220 via the "+" icon in that region. The "+" icon merely illustrates one way to add action blocks, and the invention is not limited thereto. For example, other text icons, other symbol icons, or variations in the position of the add icon inside or outside the editing region are within the scope of the present invention.
After an action block is added, reference may be made to FIG. 3, which shows a schematic diagram of adding an action block according to a specific embodiment of the invention. An action block 230 is first displayed in the editing region to which it was added (in this embodiment, the first sequence editing region). In a preferred embodiment of the present invention, the added action block is a default action block, corresponding to a default action type and default action parameters. Upon receiving the user's operation on the added action block 230, an action type list 231 is displayed on the action block 230. Specifically, an icon of the block's action type is displayed in the display area of the action block 230, and the action type list 231 is displayed by left-clicking, right-clicking, touching, long-pressing, or pressure-pressing the icon. FIG. 3 merely illustrates icons for a plurality of action types included in the action type list 231. For example, A denotes position, B denotes gripper action, C denotes wait, D denotes a warning signal, E denotes a digital signal, F denotes an analog signal, G denotes a semaphore, and H denotes load configuration. The digital signal, analog signal, and semaphore types may further include set, read, wait, and other sub-types, which are not described again here. Variations in the display form of the action types (icons, text, or combinations of both), the specific content of the action types (e.g., position, semaphore, digital IO, analog IO, message prompt, wait, load configuration, gripper action, user coordinate system), and the number of action types are within the scope of the invention. Then, the user's operation on the action type list 231 is received, and the action type of the added action block 230 is determined.
The following describes editing operations on an action block, taking an action block of the position type as an example. Referring first to FIG. 4, which illustrates a position-type action block in a non-editing state: the action block 230A in the non-editing state displays only the icon and description of the block's action type. Referring next to FIG. 5, which illustrates a position-type action block in an editing state.
Specifically, the following steps are also included after step S110 in FIG. 1: the user's editing operation on an action block 230A (FIG. 4) is received, and the action block 230B in the editing state (FIG. 5) displays a plurality of edit items 230B1 at the edge of the action block display area. The edit items may include, for example, an action package name, a loop count, copy, and delete. The user's editing operation on the action block 230A may, for example, differ from the operation that displays the action type list 231. For instance, the action type list 231 may be displayed by a touch within a pressure or time threshold, while the editing state is entered by a touch beyond that pressure or time threshold. Alternatively, operating an area inside the action block display area but outside the icon can distinguish entering the editing state from displaying the action type list 231. As another alternative, a dedicated edit icon may be provided, located inside the action block display area or elsewhere; operating the edit icon enters the editing state. When the edit icon is located outside the action block display area, the action block to be edited is determined by selecting it (as indicated by the arrow in the upper right corner of FIG. 5).
In the embodiment of a position-type action block, there may be, for example, four edit items 230B1: H denotes update position, I denotes adjust speed/acceleration time, J denotes copy, and K denotes delete.
After the plurality of edit items 230B1 are displayed, the user's operation on an edit item 230B1 (such as a left click, right click, touch, long press, or pressure press on the item) is received, and the edit page of that edit item is displayed on one side of the item.
Taking the update-position edit item as an example, FIG. 6 illustrates an expanded edit item in the editing state of a position-type action block according to an embodiment of the present invention. The update-position edit item 230B1 expands to display an edit page 230B2, which shows two icons: L denotes refresh mode and M denotes posture adjustment. When refresh mode L is selected, the robot enters an automatic update mode: the robot is put into teaching mode via a hardware button or software control, is moved to the position and posture to be reached, and the refresh mode is ended by clicking or pressing a button; the latest current position information is then written and used as the action parameter of the action block. When posture adjustment M is selected, a robot control interface is entered, and the robot is driven to the desired position through fine adjustments in each direction. In some variations, operating the edit item 230B1 directly enters the fine-tuning interface shown in FIG. 11, where posture adjustment is performed. The fine-tuning interface provides joint-angle adjustment, coordinate-space adjustment, and joint fine-tuning (fine-tuning components for the six joints, as shown in the lower part of FIG. 11). In FIG. 11, the robot arm drawn with a solid line is the arm's current (real) position, while the arm drawn with a broken line is the position stored in the action block, facilitating comparison and reference by the user.
Taking the adjust speed/acceleration time edit item as an example, FIG. 7 illustrates another expanded edit item in the editing state of a position-type action block according to an embodiment of the present invention. The speed/acceleration-time edit item 230B1 expands to display an edit page 230B3, which shows a coordinate axis whose abscissa is acceleration time and whose ordinate is speed; the user's operations on the icons on the abscissa and ordinate are received to edit the speed and acceleration time. In this embodiment, the slope of the straight line connecting the abscissa and ordinate values represents the acceleration, so the motion parameters to be adjusted can be displayed in multiple dimensions through the coordinate axes. The speed and acceleration-time adjustment control is a purpose-built control that lets the user set parameters intuitively.
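The slope relationship above (speed on the ordinate, acceleration time on the abscissa, slope = acceleration) reduces to a single division. A minimal check of that relationship, with assumed units of mm/s and seconds (giving mm/s²):

```python
def acceleration(speed, accel_time):
    """Acceleration implied by the edit page's line: slope = speed / accel_time.

    speed: target speed (assumed mm/s); accel_time: time to reach it (s).
    """
    if accel_time <= 0:
        raise ValueError("acceleration time must be positive")
    return speed / accel_time

# e.g. reaching 100 mm/s in 0.5 s implies an acceleration of 200 mm/s^2
print(acceleration(100.0, 0.5))
```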
In the above embodiment, copying and deleting can be performed without entering an edit page.
The above describes operations on an action block using the position type as an example; the present invention is not limited thereto.
In some embodiments of the present invention, step S110 shown in FIG. 1 may be followed by a packaging operation on at least one action block. The packaging may be performed in the first sequence editing region and/or the second sequence editing region; the invention is not limited in this respect. The packaging operation provided by the present invention is described below, taking the first sequence editing region as an example.
The packaging operation may include, for example: receiving the user's packaging operation on at least two action blocks in the first action block sequence, so as to package the selected blocks into an action package, wherein the action blocks in the package are operated on by the user as a unit. Specifically, when an action package is created, a package name can be set and the package can be configured to execute cyclically a specified number of times. The action package supports the same editing operations as an action block, including copy, delete, intra-task move, and inter-task move. Any action block outside the package can be dragged into it, and any action block inside the package can be dragged out.
Specifically, a selection range may be generated, for example, by holding the left mouse button and dragging, so that at least two action blocks within the range are selected. As another example, a two-finger swipe may determine two vertices of the selection range, selecting the action blocks within it. After selecting the range, the user can confirm the packaging by a left click, right click, touch, long press, pressure press, or the like. The invention is not limited in this regard; other ways of selecting and packaging action blocks are within its scope.
Specifically, an operation on the action package applies directly to all the action blocks inside it. For example, when an action package is deleted, all action blocks in the package are deleted at once; when the package is moved, its action blocks move with it; when the package is copied, the package and its internal action blocks are copied together.
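The "package operations apply to all inner blocks" behavior can be sketched as follows. Deleting removes the package and its blocks from the sequence in one step, and copying clones the package together with its blocks, so the clone is independent of the original. The data shapes are illustrative assumptions.

```python
import copy

def delete_package(sequence, package):
    """Remove the package; its inner blocks go with it in a single step."""
    sequence.remove(package)

def copy_package(sequence, package):
    """Clone the package together with its inner blocks and append the clone."""
    clone = copy.deepcopy(package)   # deep copy so inner blocks are duplicated too
    sequence.append(clone)
    return clone

sequence = [{"kind": "package", "blocks": ["a", "b"]}]
clone = copy_package(sequence, sequence[0])
clone["blocks"].append("c")          # editing the clone leaves the original intact
print(sequence[0]["blocks"], clone["blocks"])
```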
In the above embodiment, the action package has two display modes: collapsed display and expanded display, as shown in FIG. 8 and FIG. 9. FIG. 8 shows a schematic diagram of an action package in the expanded state, and FIG. 9 shows a schematic diagram of an action package in the collapsed state, according to specific embodiments of the invention.
In FIG. 8, the action package 251 is in the expanded display mode and contains a plurality of action blocks. Within the expanded display area, a close icon, an unpack icon, and an editable package name are provided. Operating the close icon exits the expanded display mode and enters the collapsed display mode. Operating the unpack icon (located to the right of the close icon) releases the blocks in the package as individual action blocks. The package name ("my action package") can be changed by editing it.
When the expanded display mode is closed and the collapsed display mode is entered, as shown in FIG. 9, the collapsed action package 252 has the same display size as an action block, so that more content fits in the same area, and a different display color and/or brightness, so that it can be distinguished from an action block. In the collapsed display mode, the user's settings for the package's execution parameters can also be received, the execution parameters including the loop count. An "expand" icon may also be provided in this mode; operating it switches back to the expanded display mode (see FIG. 8).
Furthermore, action packages can be uploaded to and downloaded from a cloud action-package platform: users in different roles can upload action packages, and other users can download them, completing the full ecosystem cycle.
In some embodiments of the present invention, in conjunction with fig. 10, step S110 shown in fig. 1 further includes the following step: receiving an operation of a user in the second sequence editing area 220 to generate and display a second action block sequence 242, where the second action block sequence 242 includes a plurality of action blocks arranged in a time axis order, each action block determines its action type and action parameters according to the user operation, each action type maps a corresponding program code segment, and the second action block sequence 242 is provided for the peripheral device to execute, in time axis order, the program code segment and the action parameters mapped by the action type of each action block, where the first action block sequence 241 interacts with the second action block sequence 242. The present invention does not limit the execution order between this step and step S120.
Specifically, interaction of action parameters is performed between the action blocks in the first action block sequence 241 and the second action block sequence 242, so that at least some of the action blocks in the first action block sequence 241 are triggered by at least some of the action blocks in the second action block sequence 242, or at least some of the action blocks in the second action block sequence 242 are triggered by at least some of the action blocks in the first action block sequence 241.
In a specific implementation, the first action block sequence 241 and the second action block sequence 242 start executing simultaneously. The first action block of the second action block sequence 242 waits for the "peripheral task notification" signal to become 1 while the robot executes the motion operations of the first action block sequence 241. When an action block of the first action block sequence 241 (for example, an action block whose action type is position) finishes executing, the next action block sets the "peripheral task notification" signal to 1 and execution continues. At this point, the wait of the first action block of the second action block sequence 242 ends, and the second action block sequence 242 begins executing its second action block; after the second action block sequence 242 finishes executing, the "peripheral task complete" signal is set to 1. The first action block sequence 241, which at some point waits for the "peripheral task complete" signal to become 1, then proceeds, completing the multitask interaction logic between the first action block sequence 241 and the second action block sequence 242. The foregoing is merely an illustrative description of one specific implementation of the present invention and is not intended to limit it.
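The handshake described above can be sketched with two threads and two event flags standing in for the "peripheral task notification" and "peripheral task complete" signals. This is a minimal illustration of the cross-timeline trigger logic, not the actual controller implementation:

```python
import threading

peripheral_task_notify = threading.Event()    # the "peripheral task notification" signal
peripheral_task_complete = threading.Event()  # the "peripheral task complete" signal
log = []

def robot_sequence():
    """First action block sequence: executes on the robot's time axis."""
    log.append("robot: position block")        # a motion action block executes
    peripheral_task_notify.set()               # next block raises the signal to 1
    peripheral_task_complete.wait()            # a later block waits for the peripheral
    log.append("robot: continue")

def peripheral_sequence():
    """Second action block sequence: executes on the peripheral's time axis."""
    peripheral_task_notify.wait()              # first block waits for the signal to be 1
    log.append("peripheral: task")             # second block runs the peripheral task
    peripheral_task_complete.set()             # raise the "complete" signal to 1

t1 = threading.Thread(target=robot_sequence)
t2 = threading.Thread(target=peripheral_sequence)
t1.start(); t2.start()
t1.join(); t2.join()
print(log)
```

The two timelines start together, yet the event flags force the robot's motion to precede the peripheral task, which in turn precedes the robot's continuation.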
In some implementations of the present invention, the first action block sequence and the second action block sequence may perform the corresponding data communication and trigger operations through a semaphore provided by a programming interface, or through an IO port (a digital or analog port of the robot body and flange, or a digital or analog port of an external IO control board).
In some embodiments of the present invention, after step S110 shown in fig. 1, the method further includes: receiving a drag of an action block by the user in the first sequence editing area and/or the second sequence editing area, so as to change the order of the action blocks on the time axis or move an action block to the other sequence editing area. For example, in fig. 10, an action block may be dragged within the first sequence editing area to change its execution order, or dragged from the first sequence editing area to the second sequence editing area (or from the second to the first). Accordingly, the order of the program code segments mapped by the action types of the action blocks in each action block sequence also changes.
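A minimal sketch of this drag interaction, assuming each sequence editing area is simply an ordered list of blocks (the function and sequence names below are hypothetical):

```python
def move_block(sequences, src, dst, index, new_index):
    """Drag an action block: reorder it within one time axis, or move it to
    another sequence editing area. The mapped code-segment order follows
    the block order automatically, since code is generated from the lists."""
    block = sequences[src].pop(index)
    sequences[dst].insert(new_index, block)
    return sequences

seqs = {"first": ["pos_A", "pos_B", "gripper"], "second": ["wait"]}
move_block(seqs, "first", "first", 0, 2)    # reorder on the same time axis
move_block(seqs, "first", "second", 0, 1)   # drag into the other editing area
print(seqs)
```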
Further, robot programming may be performed in different scenes, and the robot programming interface described in step S110 is provided for each scene; the actions of each set of robots may thus be defined as one scene, with different action sequences corresponding to different scenes. Within a scene, a plurality of action block sequences may be added, a plurality of time axes may execute concurrently, and inter-task communication between time axes may be performed at a certain node via IO or a semaphore.
The apparatus for robot programming provided by the present invention is described below in conjunction with fig. 12. FIG. 12 illustrates a block diagram of an apparatus for robot programming in accordance with an embodiment of the present invention. The apparatus 300 for robot programming includes an interface providing module 310 and a receiving module 320.
An interface providing module 310 is configured to provide a robot programming interface, which includes at least a first sequence editing area; and
the receiving module 320 is configured to receive an operation of a user in the first sequence editing area to generate and display a first action block sequence, where the first action block sequence includes a plurality of action blocks arranged in a time axis order, each action block determines its action type and action parameters according to the user operation, each action type maps a corresponding program code segment, and the first action block sequence is provided for the robot to execute, in time axis order, the program code segment and the action parameters mapped by the action type of each action block.
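The mapping from action types to program code segments can be illustrated as follows. The templates and function names (movej, set_gripper) are assumptions for illustration only, not the actual code segments used by the robot controller:

```python
# Hypothetical mapping from action types to code-segment templates.
CODE_TEMPLATES = {
    "position": "movej({joints}, speed={speed})",
    "gripper":  "set_gripper({state})",
    "wait":     "sleep({seconds})",
}

def generate_program(sequence):
    """Walk the action blocks in time-axis order and emit, for each block,
    the code segment mapped by its action type, filled with its parameters."""
    return [CODE_TEMPLATES[block["type"]].format(**block["params"])
            for block in sequence]

first_sequence = [
    {"type": "position", "params": {"joints": "[0, 90, 0]", "speed": 50}},
    {"type": "gripper",  "params": {"state": "closed"}},
]
print(generate_program(first_sequence))
# → ['movej([0, 90, 0], speed=50)', 'set_gripper(closed)']
```

Because each block carries only an action type and parameters, reordering the blocks on the time axis reorders the emitted code segments without the user ever touching program text.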
In the apparatus for robot programming according to the exemplary embodiment of the present invention, by using action blocks and action block sequences ordered along a time axis, a user can complete the programming of the whole robot through an intuitive, simple, and clear visual and operational interaction mode, without having to understand obscure robot terminology and programming languages. On one hand, this facilitates robot programming by personal and family users; on the other hand, it reduces enterprise labor costs.
The above block diagrams schematically illustrate various embodiments of the present invention; combinations and divisions of the blocks fall within the scope of the present invention as long as they do not depart from its spirit. The modules described above may be implemented in software, hardware, firmware, or any combination thereof.
Referring now to FIG. 13, FIG. 13 illustrates a block diagram of a system for robotic programming in accordance with a specific embodiment of the present invention. The system for robot programming comprises a device 300 for robot programming and a robot 400. Thus, programming of the robot 400 may be achieved through interaction between the device 300 for robot programming and the robot 400. The device 300 for robot programming and the robot 400 may communicate by wire or wirelessly, but the invention is not limited thereto. The robot 400 may read program code segments corresponding to the plurality of action blocks of the first action block sequence in real time according to the time axis, and fill the program code segments with action parameters to execute actions. In some variations, in the process of editing the action blocks, the program code segments corresponding to the plurality of action blocks in the first action block sequence are generated accordingly, and are changed according to the editing or order change of the action blocks. The robot 400 only needs to execute according to the generated program code segments. The invention is not so limited.
The above description schematically describes one embodiment of the present invention and is not repeated here. In an exemplary embodiment of the invention, a computer-readable storage medium is also provided, on which a computer program is stored which, when executed by a processor, can carry out the steps of the method for robot programming described in any of the above embodiments. In some possible embodiments, the various aspects of the invention may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the invention described in the method section for robot programming above, when the program product is run on the terminal device.
Referring to fig. 14, a program product 700 for implementing the above method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited in this regard and, in the present document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium, other than a readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, C++, and Golang, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the internet using an internet service provider).
In an exemplary embodiment of the invention, there is also provided an electronic device that may include a processor and a memory for storing executable instructions of the processor, wherein the processor is configured to perform the steps of the method for robot programming described in any of the above embodiments via execution of the executable instructions. Further, the electronic device may comprise one or more of the following: an Ethernet network card; a wireless network card; an IO controller; a communication circuit. The Ethernet network card and the wireless network card are network connection devices used for controlling the robot over a wired or wireless network and for modifying the corresponding program through operation; the IO controller can exchange signals and data with other external equipment through the input and output of digital and analog ports; and the communication circuit is used for the communication connection and data interaction between the upper-layer computer equipment and the robot. The invention is not limited thereto.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, various aspects of the invention may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software aspects, all of which may generally be referred to herein as a "circuit," "module," or "system."
An electronic device 800 according to this embodiment of the invention is described below with reference to fig. 15. The electronic device 800 shown in fig. 15 is only an example and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 15, the electronic device 800 is embodied in the form of a general purpose computing device. The components of the electronic device 800 may include, but are not limited to: at least one processing unit 810, at least one memory unit 820, a bus 830 connecting the various system components (including the memory unit 820 and the processing unit 810), a display unit 840, and the like.
Wherein the memory unit stores program code that can be executed by the processing unit 810, causing the processing unit 810 to perform the steps according to various exemplary embodiments of the present invention described in the method part for robot programming described above in this specification. For example, the processing unit 810 may perform the steps shown in fig. 1.
The memory unit 820 may include readable media in the form of volatile memory units such as a random access memory unit (RAM)8201 and/or a cache memory unit 8202, and may further include a read only memory unit (ROM) 8203.
The memory unit 820 may also include a program/utility 8204 having a set (at least one) of program modules 8205, such program modules 8205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The electronic device 800 may also communicate with one or more external devices 900 (e.g., keyboard, pointing device, Bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 800, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 800 to communicate with one or more other computing devices. Such communication may occur via input/output (I/O) interfaces 850. Also, the electronic device 800 may communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network such as the internet) via the network adapter 860. The network adapter 860 may communicate with other modules of the electronic device 800 via the bus 830. It should be appreciated that although not shown, other hardware and/or software modules may be used in conjunction with the electronic device 800, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiment of the present invention can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which can be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (which can be a personal computer, a server, or a network device, etc.) execute the above-mentioned method for robot programming according to the embodiment of the present invention.
Compared with the prior art, the invention has the advantages that:
by utilizing action blocks and action block sequences ordered along a time axis, a user can complete the programming of the whole robot through an intuitive, simple, and clear visual and operational interaction mode, without having to understand obscure robot terminology and programming languages. On one hand, this facilitates robot programming by personal and family users; on the other hand, it reduces enterprise labor costs.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
Claims (15)
1. A method for robot programming, comprising:
providing a robotic programming interface, the robotic programming interface including at least a first sequence editing region; and
receiving an operation of a user in the first sequence editing area to generate and display a first action block sequence, wherein the first action block sequence comprises a plurality of action blocks arranged according to a time axis sequence, each action block determines an action type and an action parameter of the action block according to the user operation, each action type maps a corresponding program code segment, and the first action block sequence is used for the robot to execute the program code segment and the action parameter mapped by the action type of each action block according to the time axis sequence.
2. The method for robotic programming as defined in claim 1, wherein the robotic programming interface further includes a second sequence editing area, the providing the robotic programming interface further comprising, after:
receiving an operation of a user in the second sequence editing area to generate and display a second action block sequence, wherein the second action block sequence comprises a plurality of action blocks arranged according to a time axis sequence, each action block determines an action type and an action parameter of the action block according to the user operation, each action type maps a corresponding program code segment, and the second action block sequence is used for the peripheral equipment to execute the program code segment and the action parameter mapped by the action type of each action block according to the time axis sequence,
wherein the first sequence of action blocks interacts with the second sequence of action blocks.
3. The method for robot programming of claim 2, wherein the first sequence of action blocks interacting with the second sequence of action blocks comprises:
performing interaction of action parameters according to the action blocks in the first action block sequence and the second action block sequence, so that at least part of action blocks in the first action block sequence are triggered by at least part of action blocks in the second action block sequence; or at least part of the action blocks in the second action block sequence are triggered by at least part of the action blocks in the first action block sequence.
4. The method for robotic programming as set forth in claim 2, wherein said providing a robotic programming interface further comprises:
receiving a packaging operation of a user on at least two action blocks in the first action block sequence or the second action block sequence, so as to package the at least two selected action blocks into an action package, wherein a plurality of action blocks in the action package are simultaneously operated by the user.
5. The method for robot programming of claim 4, wherein the action package has two display modes, a collapsed display and an expanded display, the action package having the same display size as an action block when in the collapsed display, and the action package having a display color and/or brightness different from that of an action block.
6. The method for robot programming of claim 4, wherein the receiving a packaging operation of a user on at least two action blocks of the first action block sequence or the second action block sequence, so as to package the selected at least two action blocks into an action package, further comprises:
and receiving the setting of the user on the execution parameters of the action packet, wherein the execution parameters comprise the cycle times of cycle execution.
7. The method for robotic programming as set forth in claim 2, wherein said providing a robotic programming interface further comprises:
and receiving dragging of the action blocks of the user in the first sequence editing area and/or the second sequence editing area so as to change the sequence of the action blocks in the time axis or move the action blocks to another sequence editing area.
8. A method for robot programming according to any of claims 2 to 7, wherein said providing a robot programming interface is followed by:
receiving an adding operation of a user in the first sequence editing area and/or the second sequence editing area so as to add an action block in the first sequence editing area and/or the second sequence editing area;
receiving the operation of the user on the added action block, and displaying an action type list on the added action block; and
and receiving the operation of the user on the action type list, and determining the action type of the added action block.
9. A method for robot programming according to claim 8, characterized in that the action types comprise at least position, semaphore, digital IO, analog IO, message prompt, wait, load configuration, gripper, and user coordinate action.
10. The method for robotic programming as set forth in claim 9, wherein said providing a robotic programming interface further comprises:
receiving an editing operation of a user on an action block, wherein the action block is displayed in an editing state, and a plurality of editing items are displayed on the edge of an action block display area of the action block in the editing state;
receiving the operation of the user on the editing item, and displaying an editing page of the editing item on one side of the editing item; and
and receiving the operation of the user on the editing page, and editing the action parameters of the action block.
11. The method for robotic programming as claimed in claim 10, wherein the edit item includes at least a speed and an acceleration time when the action type is a position, the edit page displays a coordinate axis with an abscissa of the coordinate axis being the acceleration time and an ordinate of the coordinate axis being the speed, and the receiving the user's operation on the edit item includes: and receiving the operation of the user on the icon on the abscissa and the icon on the ordinate of the coordinate axis so as to edit the speed and the acceleration time.
12. An apparatus for robotic programming, comprising:
an interface providing module for providing a robot programming interface, wherein the robot programming interface at least comprises a first sequence editing area; and
the receiving module is used for receiving the operation of a user in the first sequence editing area to generate and display a first action block sequence, the first action block sequence comprises a plurality of action blocks which are arranged according to a time axis sequence, each action block determines the action type and the action parameter of the action block according to the user operation, each action type maps a corresponding program code segment, and the first action block sequence is used for the robot to execute the program code segment and the action parameter mapped by the action type of each action block according to the time axis sequence.
13. A system for robotic programming, comprising:
means for robotic programming as claimed in claim 12; and
a robot.
14. An electronic device, characterized in that the electronic device comprises:
a processor;
a memory having stored thereon a computer program which, when executed by the processor, performs a method for robot programming according to any of claims 1 to 11;
the electronic device further comprises one or more of the following:
an Ethernet network card; a wireless network card; an IO controller; a communication circuit.
15. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the method for robot programming according to any one of claims 1 to 11.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201911162546.6A CN110955421A (en) | 2019-11-22 | 2019-11-22 | Method, system, electronic device, storage medium for robot programming |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN110955421A true CN110955421A (en) | 2020-04-03 |
Family
ID=69978172
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201911162546.6A Pending CN110955421A (en) | 2019-11-22 | 2019-11-22 | Method, system, electronic device, storage medium for robot programming |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN110955421A (en) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111552238A (en) * | 2020-04-17 | 2020-08-18 | 达闼科技(北京)有限公司 | Robot control method, device, computing equipment and computer storage medium |
| CN112099775A (en) * | 2020-09-15 | 2020-12-18 | 上海岭先机器人科技股份有限公司 | Method for coding garment manufacturing process flow |
| CN112328238A (en) * | 2021-01-05 | 2021-02-05 | 深圳点猫科技有限公司 | Building block code execution control method, system and storage medium |
| CN112965709A (en) * | 2021-04-06 | 2021-06-15 | 乐聚(深圳)机器人技术有限公司 | Building block generation method, building block generation device, building block generation equipment and storage medium |
| CN114211527A (en) * | 2021-12-30 | 2022-03-22 | 上海清芸机器人有限公司 | Continuous action semantic coding and translating method and system of humanoid entity robot |
| CN116400637A (en) * | 2023-03-17 | 2023-07-07 | 北京麦乐程物联技术有限公司 | Programmable electronic device and programming method applied to programmable electronic device |
| CN116560640A (en) * | 2023-07-05 | 2023-08-08 | 深圳墨影科技有限公司 | Visual editing system and method based on robot design system |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3243607A1 (en) * | 2016-05-09 | 2017-11-15 | OpiFlex Automation AB | A system and a method for programming an industrial robot |
| CN107678743A (en) * | 2017-09-27 | 2018-02-09 | 北京酷思倍科技有限公司 | A kind of method for intelligent robot programming |
| CN108268255A (en) * | 2018-02-11 | 2018-07-10 | 遨博(北京)智能科技有限公司 | For programming the method and apparatus of robot |
Non-Patent Citations (1)
| Title |
|---|
| WEIXIN_34247032: "Introduction to Computer Science and Engineering: A Visual Programming Practice Approach Based on IoT and Robotics, 2nd ed.", https://blog.csdn.net/weixin_34247032/article/details/90363510 | * |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN110955421A (en) | Method, system, electronic device, storage medium for robot programming | |
| US11562544B2 (en) | Transferring graphic objects between non-augmented reality and augmented reality media domains | |
| US8584047B2 (en) | Orbital representation of hierarchical navigation | |
| JP3898235B2 (en) | How to provide a graphical user interface | |
| US8762869B2 (en) | Reduced complexity user interface | |
| US7917868B2 (en) | Three-dimensional motion graphic user interface and method and apparatus for providing the same | |
| US7765496B2 (en) | System and method for improving the navigation of complex visualizations for the visually impaired | |
| CN103229141A (en) | Managing workspaces in a user interface | |
| EP3514672A1 (en) | System and method for managing digital content items | |
| CN112527224B (en) | Spliced screen video layout device, method, terminal device, system and medium | |
| WO2015126433A1 (en) | Navigating a hierarchal data set | |
| US9146667B2 (en) | Electronic device, display system, and method of displaying a display screen of the electronic device | |
| TW201622917A (en) | Method for generating robot operation program, and device for generating robot operation program | |
| US10395413B2 (en) | Dynamic user interfaces | |
| JP2019087284A (en) | Dialogue method for user interface | |
| US9268477B2 (en) | Providing contextual menus | |
| US11182073B2 (en) | Selection on user interface based on cursor gestures | |
| JP5875555B2 (en) | Image creation system | |
| JP5488930B2 (en) | Housework plan creation support device and housework plan creation support method | |
| CN111061465B (en) | Reverse mapping method, system, electronic equipment and storage medium for robot programming | |
| CN113805849A (en) | Robot action pre-compiling method and device and electronic equipment | |
| Freitag et al. | Liquid: Library for Interactive User Interface Development. | |
| JP6854785B2 (en) | User interface design device | |
| CN117234403A (en) | Interaction method and electronic equipment | |
| JP2668606B2 (en) | User interface creation system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | | |
| SE01 | Entry into force of request for substantive examination | | |
| WD01 | Invention patent application deemed withdrawn after publication | | |