
HiSMoT

HIGH-SPEED MOTION TRACKING AND COUPLING FOR HUMAN-ROBOT COLLABORATIVE ASSEMBLY TASKS —

Enabling robots to learn how to collaborate from humans


Funding: German Research Foundation (DFG)
Grant number: 500490184
Project duration: 01.09.2023 - 31.12.2025
Collaborating University: Ruhr-Universität Bochum (RUB), Chair of Production Systems (LPS)

Background

Today, the capabilities of human-robot collaboration (HRC) for manufacturing processes are widely investigated, mainly in lab environments, but they are rarely applied in real production systems. Collaborative handling of an object is one of the current challenges for HRC’s real collaboration capabilities. In this context, it is still unclear how robots and humans should behave when jointly handling an object and anticipating each other with mutual care. Since the speed of the robot tool center point (TCP) depends on the human motion, high joint speeds may arise, which pose a potential risk to human body parts. To date, robots have been represented by kinematic and dynamic models of rigid bodies, which yield structured and controlled motion profiles compared to data-driven human motion generation techniques.
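
Why a human-imposed TCP speed can translate into high joint speeds follows from the differential kinematics, q_dot = pinv(J(q)) * x_dot: near singular configurations, even a moderate Cartesian velocity demands very large joint velocities. The following minimal sketch illustrates this for a planar two-link arm; the link lengths and joint configurations are illustrative assumptions, not the project's robot model.

    import numpy as np

    # Minimal sketch: differential kinematics of a planar two-link arm.
    # Link lengths and joint angles are illustrative assumptions only.
    L1, L2 = 0.4, 0.3  # link lengths in metres (assumed)

    def jacobian(q):
        """Geometric Jacobian mapping joint velocities to TCP velocity (x, y)."""
        q1, q2 = q
        return np.array([
            [-L1 * np.sin(q1) - L2 * np.sin(q1 + q2), -L2 * np.sin(q1 + q2)],
            [ L1 * np.cos(q1) + L2 * np.cos(q1 + q2),  L2 * np.cos(q1 + q2)],
        ])

    def joint_speeds(q, v_tcp):
        """Joint velocities needed to realise a human-imposed TCP velocity v_tcp."""
        return np.linalg.pinv(jacobian(q)) @ v_tcp

    v_tcp = np.array([0.20, 0.0])          # modest, human-guided TCP speed (m/s)
    q_comfortable = np.array([0.8, 1.2])   # well-conditioned configuration
    q_stretched = np.array([0.10, 0.05])   # nearly stretched out, close to singular
    print(joint_speeds(q_comfortable, v_tcp))  # moderate joint speeds
    print(joint_speeds(q_stretched, v_tcp))    # much higher joint speeds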

Project objective

To investigate real-time HRC motion coupling and harmonization methods that closely anticipate human motion behavior while maintaining a high degree of mutual care.
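
As a hypothetical illustration of what real-time coupling with a speed limit could look like (a sketch only, not the method developed in HiSMoT), the snippet below low-pass filters a tracked human hand velocity and clamps the resulting TCP velocity command to a collaborative speed limit. All parameter values are assumptions.

    import numpy as np

    # Hypothetical coupling loop (illustrative only, not the HiSMoT method):
    # smooth the tracked human hand velocity and cap the commanded TCP speed.
    ALPHA = 0.2    # low-pass filter weight per control cycle (assumed)
    V_MAX = 0.25   # collaborative TCP speed limit in m/s (assumed)

    def couple(v_human, v_cmd_prev):
        """One control cycle: filter the human velocity, then limit the TCP speed."""
        v_cmd = (1.0 - ALPHA) * v_cmd_prev + ALPHA * v_human
        speed = np.linalg.norm(v_cmd)
        if speed > V_MAX:
            v_cmd *= V_MAX / speed   # scale down, keep direction
        return v_cmd

    # Toy usage: a sudden human hand motion is smoothed and capped for the robot.
    v_cmd = np.zeros(3)
    for v_human in [np.zeros(3)] * 3 + [np.array([0.8, 0.0, 0.0])] * 7:
        v_cmd = couple(v_human, v_cmd)
        print(np.round(v_cmd, 3))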

Method

Design of experiment


Research question

Within the scope of HiSMoT, the following basic questions guide the investigation:

Preliminary results

Digital twin experimental setup for two humans collaboratively handling an object

Publications

Conference Articles

  1. Raza, S.M., Tuli, T.B., and Manns, M., 2024. Human Action Sequence Prediction for (Re)Configuring Machine Tools. Procedia CIRP, 130, 1170–1175.
  2. Saeed, R., Tuli, T.B., Habersang, T., Weikum, M., Kuhlenkötter, B., and Manns, M., 2024. Human motion generation using latent space constraint for manual assembly tasks. In: WGP Jahreskongress 2024. (In production).
  3. Saeed, R., Tuli, T.B., Weikum, M., and Manns, M., 2025. Towards trajectory-based latent space control for human-robot collaboration. In: Proceedings of the 58th CIRP Conference on Manufacturing Systems. (Accepted).
  4. Saeed, R., Tuli, T.B., and Manns, M., 2025. Experimental setup for motion capturing two humans jointly handling an object. In: Proceedings of the 58th CIRP Conference on Manufacturing Systems. (Accepted).
  5. Habersang, T., Saeed, R., Manns, M., and Kuhlenkötter, B., 2025. Motion harmonization for human-robot collaborative handling tasks. In: Procedia CIRP Design. (Accepted).

Book chapter

  1. Manns, M., Tuli, T.B., 2025. Digital human modeling for human robot interaction (Accepted for publication). Springer Nature.

Miscellaneous

  1. Tuli, T.B., Saeed, R., and Manns, M., 2024. Experimental design for human-to-human collaboration during collaborative object handling using digital twin (Poster presentation, ZESS PhD Forum).
  2. Tuli, T.B., Saeed, R., and Manns, M., 2024. Target position-constrained ‘reach’ human motion exhibits either Gaussian or Gaussian Mixture (Poster presentation, ZESS Open day).

Project team
