An Application of a Wearable Device with Motion-Capture and Haptic-Feedback for Human–Robot Collaboration

  • Conference paper
Product Lifecycle Management. PLM in Transition Times: The Place of Humans and Transformative Technologies (PLM 2022)

Abstract

Research on human–machine collaboration in Industry 5.0 has attracted significant attention in the manufacturing sector. Although human–robot collaboration can improve work efficiency and productivity, designing the collaborative process is time-consuming and cost-intensive. The digitalization of machines facilitates automation and makes mechanical tasks more intelligent. However, digitalizing humans to make workers' operational functions more intelligent is difficult. To address this difficulty, this study designed a human–robot interaction application based on motion capture using a digital human and a virtual robot. The proposed framework supports both process managers and shop-floor workers. Process managers can design the optimal collaborative process by interacting with robots and identifying their movements in the virtual world. Shop-floor workers can avoid collision accidents by checking the robot's future movements in the virtual world and can thereby become proficient in the collaborative task before performing it in the physical world. An experiment was conducted on a virtual shop-floor modeled after a physical shop-floor. The experimental results showed that a worker can avoid collisions with the help of the proposed framework. Thus, the proposed framework can prevent collisions and accidents during human–robot collaboration in the real world.
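To make the collision-checking idea above concrete, the following sketch illustrates one way such a check could be implemented. It is not taken from the paper: it assumes a hypothetical setup in which the digital human's motion-capture joints and the robot's planned joint positions over the next few time steps are available as 3-D coordinates in a common virtual frame, and it simply tests whether any predicted robot pose comes closer to the worker than a chosen safety threshold. The function names, array shapes, and threshold value are illustrative assumptions.

import numpy as np

# Hypothetical safety threshold (metres); in practice this would be tuned
# to the robot's speed and the tracking accuracy.
SAFETY_DISTANCE = 0.30

def min_human_robot_distance(human_joints: np.ndarray,
                             robot_trajectory: np.ndarray) -> float:
    """Minimum Euclidean distance between any digital-human joint and any
    robot joint pose over a predicted trajectory.

    human_joints:     (H, 3) motion-capture joint positions.
    robot_trajectory: (T, R, 3) robot joint positions for T future steps.
    """
    # Pairwise differences between every future robot joint and every human joint.
    diffs = robot_trajectory[:, :, None, :] - human_joints[None, None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)  # shape (T, R, H)
    return float(dists.min())

def check_collision_risk(human_joints, robot_trajectory):
    """Return True (and report) if the predicted separation falls below the threshold."""
    d = min_human_robot_distance(human_joints, robot_trajectory)
    if d < SAFETY_DISTANCE:
        # In the framework described in the abstract, a warning like this
        # would be delivered to the worker (e.g. as haptic feedback).
        print(f"WARNING: predicted human-robot distance {d:.2f} m is below "
              f"the {SAFETY_DISTANCE:.2f} m safety threshold")
        return True
    return False

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    human = rng.uniform(0.0, 1.0, size=(17, 3))     # e.g. 17 tracked joints
    robot = rng.uniform(0.0, 1.0, size=(20, 6, 3))  # 20 future steps, 6 joints
    check_collision_risk(human, robot)

Running the check against the robot's planned trajectory in the virtual shop-floor, rather than against its current pose only, is what allows the worker to react before the motion is executed in the physical world.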



Acknowledgement

This work was supported by the Smart Manufacturing Innovation R&D project funded by the Korea Ministry of SMEs and Startups (Project No. RS-2022-00140261) and by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2022-0-00866, Development of cyber-physical manufacturing base technology that supports high-fidelity and distributed simulation for large-scalability).

Author information


Corresponding author

Correspondence to Sang Do Noh.



Copyright information

© 2023 IFIP International Federation for Information Processing

About this paper


Cite this paper

Yun, J., Kim, G.Y., Sajadieh, M., Yang, J., Kim, D., Noh, S.D. (2023). An Application of a Wearable Device with Motion-Capture and Haptic-Feedback for Human–Robot Collaboration. In: Noël, F., Nyffenegger, F., Rivest, L., Bouras, A. (eds) Product Lifecycle Management. PLM in Transition Times: The Place of Humans and Transformative Technologies. PLM 2022. IFIP Advances in Information and Communication Technology, vol 667. Springer, Cham. https://doi.org/10.1007/978-3-031-25182-5_36


  • DOI: https://doi.org/10.1007/978-3-031-25182-5_36

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-25181-8

  • Online ISBN: 978-3-031-25182-5

  • eBook Packages: Computer Science, Computer Science (R0)
