Research Article

DEEP LEARNING BASED HUMAN ROBOT INTERACTION WITH 5G COMMUNICATION

Year 2023, Volume: 11 Issue: 2, 423 - 438, 01.06.2023
https://doi.org/10.36306/konjes.1228275

Abstract

Factories pursuing digital transformation accelerate their production and surpass their competitors by increasing controllability and efficiency. In this study, data obtained through image processing were transferred to a collaborative robot arm over 5G communication, enabling remote control of the arm. A 3D-printed humanoid hand was mounted on the end of the robot arm for bin picking. The hand's five fingers are each driven by a servo motor. For finger control, the user wore a glove; the user's finger positions were transferred to the servo motors via a flex sensor attached to each finger of the glove, providing the desired pick-and-place operation. Position control of the robot arm was realized with image processing: the glove worn by the user was detected by two different YOLO (You Only Look Once) methods. The YOLOv4 and YOLOv5 algorithms, implemented in Python, were compared for object detection. During the test phase, the highest detection accuracy on the front camera was 99.75% with YOLOv4 and 99.83% with YOLOv5; on the side camera, YOLOv4 reached 97.59% and YOLOv5 reached 97.90%.
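The glove-to-hand mapping described in the abstract can be sketched as a linear scaling from each flex sensor's analog reading to a servo angle, one sensor per finger. This is a minimal illustration only, not the authors' implementation; the calibration bounds, angle range, and function names below are assumptions.

```python
def flex_to_servo_angle(reading, flex_min=200, flex_max=800,
                        angle_min=0.0, angle_max=180.0):
    """Map a raw flex-sensor ADC reading to a servo angle in degrees.

    flex_min/flex_max are hypothetical calibration values for the
    sensor at rest and fully bent; readings outside are clamped.
    """
    reading = max(flex_min, min(flex_max, reading))  # clamp to calibrated range
    span = flex_max - flex_min
    return angle_min + (reading - flex_min) / span * (angle_max - angle_min)


def glove_to_servo_angles(readings):
    """Convert the five glove sensor readings to five servo commands."""
    return [flex_to_servo_angle(r) for r in readings]
```

In a setup like the one described, these angles would then be sent over the 5G link to the controller driving the servo motors on the humanoid hand.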

Supporting Institution

Mitsubishi Electric Türkiye

Project Number

No project number

References

  • [1] X. Chen, X. Huang, Y. Wang, and X. Gao, "Combination of augmented reality based brain- computer interface and computer vision for high-level control of a robotic arm," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 28, no. 12, pp. 3140-3147, 2020.
  • [2] Z. Zhang, Y. Huang, S. Chen, J. Qu, X. Pan, T. Yu, and Y. Li, "An intention-driven semi-autonomous intelligent robotic system for drinking," Frontiers in Neurorobotics, vol. 11, pp. 1-14, 2017.
  • [3] S. M. Achari, S. G. Mirji, C. P. Desai, M. S. Hulasogi, and S. P. Awari, "Gesture based wireless control of robotic hand using image processing," International Research Journal of Engineering and Technology, vol. 5, no. 5, pp. 3340-3345, 2018.
  • [4] J. O. P. Arenas, R. J. Moreno, and R. D. H. Beleño, "Convolutional neural network with a dag architecture for control of a robotic arm by means of hand gestures," Contemporary Engineering Sciences, vol. 11, no. 12, pp. 547-557, 2018.
  • [5] P. Atre, S. Bhagat, N. Pooniwala, and P. Shah, "Efficient and feasible gesture controlled robotic arm," in 2018 Second International Conference on Intelligent Computing and Control Systems, 2018, pp. 1-6: IEEE.
  • [6] A. A. Malik and A. J. R. Brem, "Digital twins for collaborative robots: A case study in human- robot interaction," Robotics and Computer-Integrated Manufacturing, vol. 68, pp. 1-16, 2021.
  • [7] J. P. Vasconez, G. A. Kantor, and F. A. A. Cheein, "Human–robot interaction in agriculture: A survey and current challenges," Biosystems Engineering, vol. 179, pp. 35-48, 2019.
  • [8] K. Fujii, G. Gras, A. Salerno, and G.-Z. Yang, "Gaze gesture based human robot interaction for laparoscopic surgery," Medical Image Analysis, vol. 44, pp. 196-214, 2018.
  • [9] M. N. Datta, Y. Rathi, and M. Eliazer, "Wheat heads detection using deep learning algorithms," Annals of the Romanian Society for Cell Biology, vol. 25, no.5, pp. 5641-5654, 2021.
  • [10] F. Jubayer, J. A. Soeb, A. N. Mojumder, M. K. Paul, P. Barua, S. Kayshar, S. S. Akter, M. Rahman, and A. Islam, "Detection of mold on the food surface using YOLOv5," Current Research in Food Science, vol. 4, pp. 724-728, 2021.
  • [11] M. Karakaya, M. F. Celebi, A. E. Gök, and S. Ersoy, "Discovery Of Agricultural Diseases By Deep Learning And Object Detection," Environmental Engineering and Management Journal, vol. 21, no. 1, pp. 163-173, 2022.
  • [12] R. Li and Y. J. E. Wu, "Improved YOLO v5 Wheat Ear Detection Algorithm Based on Attention Mechanism," Electronics, vol. 11, no. 11, pp. 1673-1694, 2022.
  • [13] C. A. Owusu-Agyei and J. Hou, "Hands Activities Detection in Egocentric Interactions Using YOLOv5," in 2021 International Conference on UK-China Emerging Technologies, 2021, pp. 199-203: IEEE.
  • [14] S. Yuan, Y. Du, M. Liu, S. Yue, B. Li, and H. Zhang, "YOLOv5-Ytiny: A Miniature Aggregate Detection and Classification Model," Electronics, vol. 11, no. 11, pp. 1743-1758, 2022.
  • [15] M. P. Mathew and T. Y. Mahesh, "Leaf-based disease detection in bell pepper plant using YOLO v5," Signal, Image and Video Processing, vol. 16, no. 3, pp. 841-847, 2022.
  • [16] S. Verma, S. Tripathi, A. Singh, M. Ojha, and R. R. Saxena, "Insect Detection and Identification using YOLO Algorithms on Soybean Crop," in TENCON 2021-2021 IEEE Region 10 Conference, 2021, pp. 272-277: IEEE.
  • [17] B. Yan, P. Fan, X. Lei, Z. Liu, and F. Yang, "A real-time apple targets detection method for picking robot based on improved YOLOv5," Remote Sensing, vol. 13, no. 9, pp. 1619-1642, 2021.
  • [18] Y. Hathat, D. Samai, A. Benlamoudi, K. Bensid, and A. Taleb-Ahmed, "SNCF workers detection in the railway environment based on improved YOLO v5," in 7th International Conference on Image and Signal Processing and their Applications, 2022, pp. 1-7: IEEE.
  • [19] M. E. Otgonbold, M. Gochoo, F. Alnajjar, L. Ali, T. H. Tan, J. W. Hsieh, and P. Y. Chen, "SHEL5K: an extended dataset and benchmarking for safety helmet detection," Sensors, vol. 22, no. 6, pp. 2315-2338, 2022.
  • [20] M. Sadiq, S. Masood, and O. Pal, "FD-YOLOv5: A Fuzzy Image Enhancement Based Robust Object Detection Model for Safety Helmet Detection," International Journal of Fuzzy Systems, vol. 24, pp. 2600-2616, 2022.
  • [21] K. Ding, X. Li, W. Guo, and L. Wu, "Improved object detection algorithm for drone-captured dataset based on yolov5," in 2nd International Conference on Consumer Electronics and Computer Engineering, 2022, pp. 895-899: IEEE.
  • [22] L. Li, Z. Yao, Z. Miao, X. Qiu, and X. Yang, "YOLO-A2G: An air-to-ground high-precision object detection algorithm based on YOLOv5," in 6th International Conference on Machine Vision and Information Technology, 2022, pp. 1-8: IOP Publishing.
  • [23] U. Nepal and H. Eslamiat, "Comparing YOLOv3, YOLOv4 and YOLOv5 for autonomous landing spot detection in faulty UAVs," Sensors, vol. 22, no. 2, pp. 464-479, 2022.
  • [24] V. Slyusar, M. Protsenko, A. Chernukha, V. Melkin, O. Biloborodov, M. Samoilenko, O. Kravchenko, H. Kalynychenko, A. Rohovyi, and M. Soloshchuk, "Improving The Model Of Object Detection On Aerial Photographs And Video In Unmanned Aerial Systems," Eastern-European Journal of Enterprise Technologies, vol. 1, no. 9, pp. 115-126, 2022.
  • [25] X. Zhu, S. Lyu, X. Wang, and Q. Zhao, "TPH-YOLOv5: Improved YOLOv5 based on transformer prediction head for object detection on drone-captured scenarios," in Proceedings of the IEEE/CVF International Conference on Computer Vision, 2021, pp. 2778-2788.
  • [26] L. A. J. Abel, T. C. N. Oconer, and J. C. D. Cruz, "Realtime Object Detection of Pantry Objects Using YOLOv5 Transfer Learning in Varying Lighting and Orientation," in 2nd International Conference on Innovative Research in Applied Science, Engineering and Technology, 2022, pp. 1-7: IEEE.
  • [27] S. Li, Y. Li, Y. Li, M. Li, and X. Xu, "YOLO-FIRI: Improved YOLOv5 for Infrared Image Object Detection," IEEE Access, vol. 9, pp. 141861-141875, 2021.
  • [28] N. M. Dipu, S. A. Shohan, and K. A. Salam, "Brain Tumor Detection Using Various Deep Learning Algorithms," in International Conference on Science & Contemporary Technologies, 2021, pp. 1-6: IEEE.
  • [29] J.-H. Kim, N. Kim, Y. W. Park, and C. S. Won, "Object Detection and Classification Based on YOLO-V5 with Improved Maritime Dataset," Journal of Marine Sciences and Engineering, vol. 10, no. 3, pp. 377-391, 2022.
  • [30] W. Wahyono, A. Harjoko, A. Dharmawan, G. Kosala, and P. Y. Pranata, "A Comparison of Deep Learning Methods for Vision-based Fire Detection in Surveillance System," in The 5th International Conference on Future Networks & Distributed Systems, 2021, pp. 1-7.
  • [31] Y. Zhu and W. Q. Yan, "Traffic sign recognition based on deep learning," Multimedia Tools and Applications, vol. 81, no. 13, pp. 17779-17791, 2022.
  • [32] A. S. Hemanth, "Face Mask Detection Using YOLOv5," International Journal of Novel Research Development, vol. 7, no. 5, pp. 390-395, 2022.
  • [33] N. Ottakath, O. Elharrouss, N. Almaadeed, S. A. Maadeed, A. Mohamed, T. Khattab, and K. Abualsaud, "ViDMASK dataset for face mask detection with social distance measurement," Displays, vol. 73, p. 102235, 2022.
  • [34] A. Bochkovskiy, C. Y. Wang, and H. Y. M. Liao, "Yolov4: Optimal speed and accuracy of object detection," arXiv:2004.10934, 2020.

Details

Primary Language English
Subjects Engineering
Journal Section Research Article
Authors

Mücahid Barstuğan 0000-0001-9790-5890

Zeynep Osmanpaşaoğlu 0000-0003-1712-3632

Project Number No project number
Publication Date June 1, 2023
Submission Date January 2, 2023
Acceptance Date February 20, 2023
Published in Issue Year 2023 Volume: 11 Issue: 2

Cite

IEEE M. Barstuğan and Z. Osmanpaşaoğlu, “DEEP LEARNING BASED HUMAN ROBOT INTERACTION WITH 5G COMMUNICATION”, KONJES, vol. 11, no. 2, pp. 423–438, 2023, doi: 10.36306/konjes.1228275.
