

A human modelling and monitoring approach to support the execution of manufacturing operations

Urgo et al., 2019

Document ID: 7491004629991301940
Authors: Urgo M, Tarabini M, Tolio T
Publication year: 2019
Publication venue: CIRP Annals

Snippet

Human workers have a vital role in manufacturing given their adaptability to varying environmental conditions, their capability of judgment and understanding of the context. Nevertheless, the increasing complexity and variety of manufacturing operations ask for the …
Full text: re.public.polimi.it (PDF)

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/62 Methods or arrangements for recognition using electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N99/00 Subject matter not provided for in other groups of this subclass
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06Q DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 Computer systems utilising knowledge based models

Similar Documents

Publication Title
Urgo et al. A human modelling and monitoring approach to support the execution of manufacturing operations
Daftry et al. Introspective perception: Learning to predict failures in vision systems
Neto et al. Real-time and continuous hand gesture spotting: An approach based on artificial neural networks
Tuli et al. Knowledge-based digital twin for predicting interactions in human-robot collaboration
Wang et al. Collision-free trajectory planning in human-robot interaction through hand movement prediction from vision
Shu et al. Learning social affordance grammar from videos: Transferring human interactions to human-robot interactions
Gorecky et al. COGNITO: a cognitive assistance and training system for manual tasks in industry
Bütepage et al. Anticipating many futures: Online human motion prediction and synthesis for human-robot collaboration
Angleraud et al. Sensor-based human–robot collaboration for industrial tasks
Manns et al. Identifying human intention during assembly operations using wearable motion capturing systems including eye focus
Hak et al. Reverse control for humanoid robot task recognition
Zhang et al. Bio-inspired predictive orientation decomposition of skeleton trajectories for real-time human activity prediction
Hamabe et al. A programming by demonstration system for human-robot collaborative assembly tasks
Pal et al. Dynamic hand gesture recognition using kinect sensor
Cai et al. FedHIP: Federated learning for privacy-preserving human intention prediction in human-robot collaborative assembly tasks
Kozamernik et al. Visual quality and safety monitoring system for human-robot cooperation
Avogaro et al. Exploring 3D Human Pose Estimation and Forecasting from the Robot’s Perspective: The HARPER Dataset
Kelley et al. An architecture for understanding intent using a novel hidden markov formulation
Tsitos et al. Real-time feasibility of a human intention method evaluated through a competitive human-robot reaching game
Park et al. Hmpo: Human motion prediction in occluded environments for safe motion planning
Romeo et al. Multimodal data extraction and analysis for the implementation of Temporal Action Segmentation models in Manufacturing
Pavllo et al. Real-time marker-based finger tracking with neural networks
Juett et al. Learning to reach by building a representation of peri-personal space
Jacoby et al. Understanding dynamic human intentions to enhance collaboration performance for human-robot partnerships
Cai et al. Hierarchical Deep Learning for Intention Estimation of Teleoperation Manipulation in Assembly Tasks