Department of Computer Science, Faculty of Physical Sciences, Chukwuemeka Odumegwu Ojukwu University, Uli, Anambra State, Nigeria.
International Journal of Science and Research Archive, 2025, 14(03), 511-521
Article DOI: 10.30574/ijsra.2025.14.3.0693
Received on 01 February 2025; revised on 07 March 2025; accepted on 10 March 2025
This study presents an intelligent robotic object grasping system that combines computer vision techniques with deep reinforcement learning to enhance robotic manipulation. The proposed approach employs You Only Look Once (YOLOv3) for real-time object recognition and localisation, while a Soft Actor-Critic (SAC) agent uses depth image information to determine optimal grasping regions. By transforming the selected grasping point into a three-dimensional grasping pose, the robotic manipulator can efficiently pick and place objects. The COCO dataset was utilised to improve YOLO's detection accuracy, and transfer learning accelerated the training process. Performance evaluation of the proposed system revealed a mean Average Precision (mAP) of 91.2% for object detection and a grasping success rate of 87.3%. 10-fold cross-validation confirmed the model's robustness and generalisability, showing minimal variation in performance across test settings. Compared with traditional grasping approaches, the proposed strategy improved accuracy by 27% and execution efficiency by 35%. These findings demonstrate the YOLO-SAC framework's promise for practical robotic applications, offering a flexible and scalable approach to automated object handling across a range of settings.
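For readers who want a concrete picture of the detect-then-grasp pipeline the abstract describes, the following is a minimal illustrative sketch in Python. The helper names, the dummy detection, the naive nearest-point grasp rule standing in for the trained SAC actor, and the camera intrinsics are all assumptions for illustration; they are not the authors' implementation. A real system would run a trained YOLOv3 network and a trained SAC policy at the two placeholder steps.

    # Minimal sketch of the YOLO -> SAC -> 3-D pose pipeline described above.
    # All function names and values are illustrative placeholders.
    import numpy as np

    def detect_objects(rgb_image):
        """Placeholder for YOLOv3 inference: returns (label, bounding box)
        pairs. A trained detector would run here."""
        h, w, _ = rgb_image.shape
        return [("cup", (w // 4, h // 4, w // 2, h // 2))]  # dummy detection

    def sac_grasp_point(depth_patch):
        """Placeholder for the SAC policy: maps a depth crop to a pixel
        grasp point and gripper rotation. A trained actor network would
        go here; picking the nearest point is a naive stand-in."""
        v, u = np.unravel_index(np.argmin(depth_patch), depth_patch.shape)
        return u, v, 0.0  # (column, row, rotation in radians)

    def pixel_to_pose(u, v, depth, fx, fy, cx, cy):
        """Back-project a pixel grasp point into a 3-D position using the
        pinhole camera model (intrinsics fx, fy, cx, cy assumed known)."""
        z = depth
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        return np.array([x, y, z])

    # Example run on synthetic data
    rgb = np.zeros((480, 640, 3), dtype=np.uint8)
    depth = np.random.uniform(0.4, 1.0, size=(480, 640))

    for label, (x0, y0, bw, bh) in detect_objects(rgb):
        patch = depth[y0:y0 + bh, x0:x0 + bw]
        u, v, rot = sac_grasp_point(patch)
        pose = pixel_to_pose(x0 + u, y0 + v, patch[v, u],
                             fx=525.0, fy=525.0, cx=320.0, cy=240.0)
        print(f"{label}: grasp at {pose}, rotation {rot:.2f} rad")

The key design point this sketch reflects is the division of labour in the paper: the detector localises objects in the RGB image, the reinforcement-learning policy chooses where on the object's depth patch to grasp, and a camera-model back-projection converts that pixel choice into the three-dimensional pose the manipulator executes.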
Keywords: Intelligent Robot; Object Grasping; Computer Vision; Reinforcement Learning; Soft Actor-Critic; You Only Look Once
Osita Miracle Nwakeze, Ogochukwu C Okeke and Ike Joseph Mgbemfulike. Intelligent robotic object grasping system using computer vision and deep reinforcement learning techniques. International Journal of Science and Research Archive, 2025, 14(03), 511-521. Article DOI: https://doi.org/10.30574/ijsra.2025.14.3.0693.
Copyright © 2025. The author(s) retain the copyright of this article, which is published under the terms of the Creative Commons Attribution License 4.0.