Urgo et al., 2019 - Google Patents
A human modelling and monitoring approach to support the execution of manufacturing operations
- Document ID: 7491004629991301940
- Authors: Urgo M; Tarabini M; Tolio T
- Publication year: 2019
- Publication venue: CIRP Annals
Snippet
Human workers have a vital role in manufacturing given their adaptability to varying environmental conditions, their capability of judgment and understanding of the context. Nevertheless, the increasing complexity and variety of manufacturing operations ask for the …
Classifications
- G—PHYSICS
  - G05—CONTROLLING; REGULATING
    - G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
      - G05B2219/00—Program-control systems
        - G05B2219/30—Nc systems
          - G05B2219/39—Robotics, robotics to robotics hand
- G—PHYSICS
  - G05—CONTROLLING; REGULATING
    - G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
      - G05B2219/00—Program-control systems
        - G05B2219/30—Nc systems
          - G05B2219/40—Robotics, robotics mapping to robotics vision
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T7/00—Image analysis
        - G06T7/20—Analysis of motion
- G—PHYSICS
  - G05—CONTROLLING; REGULATING
    - G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
      - G05B19/00—Programme-control systems
        - G05B19/02—Programme-control systems electric
          - G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
      - G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
        - G06K9/62—Methods or arrangements for recognition using electronic means
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N99/00—Subject matter not provided for in other groups of this subclass
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06Q—DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
      - G06Q10/00—Administration; Management
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N5/00—Computer systems utilising knowledge based models
Similar Documents
| Publication | Title |
|---|---|
| Urgo et al. | A human modelling and monitoring approach to support the execution of manufacturing operations |
| Daftry et al. | Introspective perception: Learning to predict failures in vision systems |
| Neto et al. | Real-time and continuous hand gesture spotting: An approach based on artificial neural networks |
| Tuli et al. | Knowledge-based digital twin for predicting interactions in human-robot collaboration |
| Wang et al. | Collision-free trajectory planning in human-robot interaction through hand movement prediction from vision |
| Shu et al. | Learning social affordance grammar from videos: Transferring human interactions to human-robot interactions |
| Gorecky et al. | COGNITO: a cognitive assistance and training system for manual tasks in industry |
| Bütepage et al. | Anticipating many futures: Online human motion prediction and synthesis for human-robot collaboration |
| Angleraud et al. | Sensor-based human–robot collaboration for industrial tasks |
| Manns et al. | Identifying human intention during assembly operations using wearable motion capturing systems including eye focus |
| Hak et al. | Reverse control for humanoid robot task recognition |
| Zhang et al. | Bio-inspired predictive orientation decomposition of skeleton trajectories for real-time human activity prediction |
| Hamabe et al. | A programming by demonstration system for human-robot collaborative assembly tasks |
| Pal et al. | Dynamic hand gesture recognition using kinect sensor |
| Cai et al. | FedHIP: Federated learning for privacy-preserving human intention prediction in human-robot collaborative assembly tasks |
| Kozamernik et al. | Visual quality and safety monitoring system for human-robot cooperation |
| Avogaro et al. | Exploring 3D Human Pose Estimation and Forecasting from the Robot’s Perspective: The HARPER Dataset |
| Kelley et al. | An architecture for understanding intent using a novel hidden markov formulation |
| Tsitos et al. | Real-time feasibility of a human intention method evaluated through a competitive human-robot reaching game |
| Park et al. | Hmpo: Human motion prediction in occluded environments for safe motion planning |
| Romeo et al. | Multimodal data extraction and analysis for the implementation of Temporal Action Segmentation models in Manufacturing |
| Pavllo et al. | Real-time marker-based finger tracking with neural networks |
| Juett et al. | Learning to reach by building a representation of peri-personal space |
| Jacoby et al. | Understanding dynamic human intentions to enhance collaboration performance for human-robot partnerships |
| Cai et al. | Hierarchical Deep Learning for Intention Estimation of Teleoperation Manipulation in Assembly Tasks |