{"id":8469,"date":"2023-06-30T13:50:34","date_gmt":"2023-06-30T06:50:34","guid":{"rendered":"https:\/\/gulfthai.com\/?p=8469"},"modified":"2023-06-30T14:01:38","modified_gmt":"2023-06-30T07:01:38","slug":"visual-servoing-control-for-robot-arm","status":"publish","type":"post","link":"https:\/\/gulfthai.com\/?p=8469","title":{"rendered":"Visual Servoing Control for Robot Arm"},"content":{"rendered":"\n<p>Mr. Atthaphan Paksakunnee 6230350483<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"410\" height=\"451\" src=\"https:\/\/i0.wp.com\/gulfthai.com\/wp-content\/uploads\/2023\/06\/image-7.jpg?resize=410%2C451&#038;ssl=1\" alt=\"\" class=\"wp-image-8475\"\/><\/figure>\n\n\n\n<p>Mr. Tathipan Chaiwattanapan 6230340038<\/p>\n\n\n\n<figure class=\"wp-block-image size-full\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"368\" height=\"459\" src=\"https:\/\/i0.wp.com\/gulfthai.com\/wp-content\/uploads\/2023\/06\/image-9.jpg?resize=368%2C459&#038;ssl=1\" alt=\"\" class=\"wp-image-8478\"\/><\/figure>\n\n\n\n<p>Mr. Bantoon Saengpairao 6230340054<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-rich is-provider-embed-handler wp-block-embed-embed-handler\"><div class=\"wp-block-embed__wrapper\">\n<div class=\"fb-post\" data-href=\"https:\/\/www.facebook.com\/Dr.Kittipong\/posts\/732531025546608?ref=embed_post\" data-width=\"552\" style=\"background-color: #fff; display: inline-block;\"><\/div>\n<\/div><\/figure>\n\n\n\n<p>03607499 Engineering Project for Robotics and Automation System III<br>This project is part of the Bachelor of Engineering curriculum.<br>Program in Robotic and Automation Systems Engineering (International), Faculty of Engineering, Sriracha Kasetsart University<br>The academic year 2023<br>June 30, 2023<br>Engineering project certificate<br>Robotic and Automation System Engineering (International)<br>Project name Visual servo controller for robot arm<br>By Mr. Bantoon Saengpairao 6230340054<br>Mr. Atthaphan Paksakunnee 6230305089<br>Mr. Tathipan Chaiwattanapan 6230340038<br>Bachelor Degrees Bachelor of Engineering<br>Major Robotic and Automation System Engineering (International)<br>B.E. 2023<br>Project Advisors asst. Prof. Dr. Kittipong Yaovaja<\/p>\n\n\n\n<p>Faculty of Engineering Sriracha Kasetsart University This project was approved as part of the Bachelor of Engineering degree program.<br>\u2026\u2026\u2026\u2026\u2026\u2026\u2026\u2026\u2026\u2026\u2026\u2026\u2026\u2026\u2026\u2026\u2026.. Advisors<br>Asst. Prof. Dr. Kittipong Yaovaja<\/p>\n\n\n\n<p>\u2003<br>Acknowledgment<br>This project was completed thanks to the contributions of many benefactors. The project team would like to express their gratitude to the faculty members who provided advice, procured various equipment, and helped troubleshoot any defects that arose, which made it possible for the project to be completed successfully.<br>We would like to thank Kasetsart University Faculty of Engineering Sriracha Campus and all personnel who kindly provided the location and facilities necessary for the project to be carried out smoothly and successfully.<br>Finally, we would like to express our deepest gratitude to our parents, relatives, and all those who have been a constant source of encouragement. 
We would also like to extend our thanks to fellow students and all those who provided assistance and advice throughout the project.<br>Regards<br>Bantoon Saengpairao<br>Atthaphan Paksakunnee<br>Tathipan Chaiwattanapan<\/p>\n\n\n\n<p>Project name Visual servo controller for robot arm<br>By Mr. Bantoon Saengpairao 6230340054<br>Mr. Atthaphan Paksakunnee 6230305089<br>Mr. Tathipan Chaiwattanapan 6230340038<br>Degree Bachelor of Engineering<br>Major Robotic and Automation System Engineering (International)<br>B.E. 2023<br>Project Advisor Asst. Prof. Dr. Kittipong Yaovaja<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"507\" height=\"380\" src=\"https:\/\/i0.wp.com\/gulfthai.com\/wp-content\/uploads\/2023\/06\/image-10.jpg?resize=507%2C380&#038;ssl=1\" alt=\"\" class=\"wp-image-8483\"\/><\/figure><\/div>\n\n\n<p><br>Abstract<br>In this project, we developed a visual servoing controller for a robotic arm. The goal was to create a controller that allows the robot arm to accurately track a moving target using visual feedback. The controller was designed to work in real time, using images captured by a camera mounted on the robot arm.<br>The visual servo controller consisted of a feature extractor, which uses computer vision techniques to detect and track the target in the camera image, and a control system, which uses the MoveIt package in ROS to adjust the robot arm&#8217;s position in real time based on the target&#8217;s location. Its performance was evaluated through a series of experiments in a simulated environment, using an ABB IRB120 robot communicating with the ROS program.<br>Overall, the results demonstrate the feasibility and effectiveness of using visual servo control for robotic arm tracking. This technology has potential applications in a range of fields, such as manufacturing, logistics, and medical robotics, where precise and efficient tracking of moving targets is important. 
Further research is needed to optimize the controller&#8217;s performance and explore its potential applications.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"586\" height=\"273\" src=\"https:\/\/i0.wp.com\/gulfthai.com\/wp-content\/uploads\/2023\/06\/image-11.jpg?resize=586%2C273&#038;ssl=1\" alt=\"\" class=\"wp-image-8485\"\/><\/figure><\/div>\n\n\n<p>TABLE OF CONTENTS<br>CONTENTS<br>CHAPTER 1<br>INTRODUCTION<br>1.1 Introduction<br>1.2 Objective<br>1.3 Scope of the project<br>1.4 Assigned duties<br>1.5 Research Schedule and Detailed Activity<br>1.6 Expected Benefits<br>CHAPTER 2<br>Theories<br>2.1 Ubuntu<br>2.1.1 Ubuntu 20.04 version<br>2.2 ROS<br>2.2.1 ROS Noetic Ninjemys<br>2.2.2 Basic ROS<br>2.2.3 MoveIt \/RViz<br>2.3 RealSense D435i depth camera<br>2.3.1 RealSense SDK 2.0<br>2.4 Aruco marker<br>2.4.1 Marker Detection<br>2.4.2 Pose Estimation<br>2.5 Robot industrial<br>2.5.1 ABB<br>2.6 Robotic eye-in-hand<br>CHAPTER 3<br>Materials and Methods<br>3.1 VirtualBox<br>3.1.1 Install and setup Ubuntu<br>3.2 ROS<br>3.2.1 Install and setup ROS<br>3.3 ROS with ABB<br>3.3.1 Install and setup Package Summary<br>3.3.2 Simulation ABB with RViz and Gazebo<br>3.3.3 Simulation ABB with RViz and Robot Studio<br>3.3.4 Simulation ABB with RViz and Real robot IRB120<br>3.4 RealSense D435i depth camera<br>3.5 Aruco marker detection<br>3.6 Visual servoing<br>3.8 Collision Safety<br>3.9 Combine program<br>CHAPTER 4<br>Results<br>4.1 Flow chart<br>4.2 Results Visual servoing<br>CHAPTER 5<br>Conclusion<br>5.1 Performance Summary<br>5.2 Problems and suggestions<br>References<\/p>\n\n\n\n<p>LIST OF TABLES<br>Table 1 Create Tasks<br>Table 2 Create Signals<br>Table 3 Tie Signals to the System Outputs<br>Table 4 Load Modules to Tasks<\/p>\n\n\n\n<p>LIST OF FIGURES<br>Figure 1 Ubuntu logo<br>Figure 2 Ubuntu website<br>Figure 3 Ubuntu 20.04 version<br>Figure 4 Ubuntu 20.04.6 LTS<br>Figure 5 ROS logo<br>Figure 6 ROS website<br>Figure 7 ROS Community<br>Figure 8 ROS Noetic Ninjemys<br>Figure 9 ROS Noetic installation<br>Figure 10 MoveIt logo<br>Figure 11 RViz interface<br>Figure 12 Transform tree<br>Figure 13 RealSense D435i depth camera<br>Figure 14 RealSense D435i depth camera structure<br>Figure 15 Example of Aruco markers<br>Figure 16 Aruco detection<br>Figure 17 Aruco pose estimate<br>Figure 18 robot industrial<br>Figure 19 ABB logo<br>Figure 20 ABB IRB120<br>Figure 21 Robotic eye-in-hand<br>Figure 22 VirtualBox logo<br>Figure 23 VirtualBox 7.0.8 platform packages download Windows hosts<br>Figure 24 Oracle VM VirtualBox Manager<br>Figure 25 Create VM<br>Figure 26 Choose the base memory VM<br>Figure 27 Virtual Hard Disk Size<br>Figure 28 VM summary<br>Figure 29 Setting VM<br>Figure 30 Storage Ubuntu<br>Figure 31 Click Start Ubuntu<br>Figure 32 Ubuntu 20.04 Linux<br>Figure 33 check rosversion<br>Figure 34 catkin workspace<br>Figure 35 ROS-Industrial Overview<br>Figure 36 ABB Experimental files<br>Figure 37 ABB IRB 120 manipulator files<br>Figure 38 ABB with RViz and Gazebo<br>Figure 39 OMPL RRTConnectkConfigDefault
<br>Figure 40 Planning box<br>Figure 41 RobotStudio IRB120<br>Figure 42 Add controller Robotstudio<br>Figure 43 Controller options<br>Figure 44 Copy the ROS files to the RobotStudio folder<br>Figure 45 Network Connection windows<br>Figure 46 Ipconfig command prompt<br>Figure 47 setting Network VM<br>Figure 48 ROS_socket.sys RAPID<br>Figure 49 Open the FlexPendant<br>Figure 50 FlexPendant running in MANUAL<br>Figure 51 Create Tasks on FlexPendant<br>Figure 52 Create Tasks on Robotstudio<br>Figure 53 Create Signals on FlexPendant<br>Figure 54 Create Signals on Robotstudio<br>Figure 55 Signals to the System Outputs on FlexPendant<br>Figure 56 Signals to the System Outputs on Robotstudio<br>Figure 57 Load Modules to Tasks on FlexPendant<br>Figure 58 Load Modules to Tasks on Robotstudio<br>Figure 59 Editor select Tasks and programs set modules<br>Figure 60 AUTO mode FlexPendant<br>Figure 61 FlexPendant waiting for connection<br>Figure 62 Ping 192.168.56.1 on Ubuntu<br>Figure 63 abb_irb120_moveit<br>Figure 64 FlexPendant connection with ROS<br>Figure 65 robot studio and Moveit<br>Figure 66 robot studio and Moveit communication<br>Figure 67 IRC5 controller<br>Figure 68 IP server socket \u201c192.168.125.1\u201d<br>Figure 69 Ethernet Status. IP 192.168.125.1<br>Figure 70 IP 192.168.125.2 Subnet mask 255.255.255.0<br>Figure 71 Add controller to connect IP<br>Figure 72 IRC5 controller port connected<br>Figure 73 Request write access FlexPendant<br>Figure 74 Create Relation<br>Figure 75 Create Relation type<br>Figure 76 Transfer all data<br>Figure 77 Transfer Summary<br>Figure 78 Reset RAPID (P-start)<br>Figure 79 FlexPendant notification: connection waiting<br>Figure 80 Network setting on Ubuntu<br>Figure 81 Set the IPv4 Method<br>Figure 82 Ubuntu turn on\/off connection<br>Figure 83 Ethernet connected Ubuntu<br>Figure 84 FlexPendant connection<br>Figure 85 The IRC5 controller auto mode<br>Figure 86 Data points to FlexPendant<br>Figure 87 Robot Plan &amp; Execute the position axis<br>Figure 88 USB connect the camera<br>Figure 89 D435i list of topics<br>Figure 90 Aruco marker detection program<br>Figure 91 Collision Safety<br>Figure 92 Collision object warning<br>Figure 93 Flow chart Visual servoing<br>Figure 94 Visual servoing working<br>Figure 95 Visual servoing working in program<br>Figure 96 Visual servoing with program<\/p>\n\n\n\n<p>CHAPTER 1<br>INTRODUCTION<br>1.1 Introduction<br>Robotics has become an important field of study, with vast applications in various industries. In recent years, there has been growing interest in developing visual servoing controllers for robotic arms to enhance their ability to interact with and manipulate objects in the environment. One approach to achieving this goal is to use RViz in ROS, which is a popular tool for robot visualization and control.<br>In this report, we present the design and implementation of a visual servo controller for a robotic arm controlled by RViz on ROS. The controller was developed using computer vision techniques to detect and track a moving target in real-time, based on images captured by a camera mounted on the robot arm. 
The control system used RViz in ROS to adjust the robot arm&#8217;s position in real-time based on the target&#8217;s location in the image.<br>The report will detail the development process of the visual servoing controller, including the feature extractor and control system components. We will also present the results of experiments conducted to evaluate the performance of the controller, using ABB robots in a simulated environment to test the RViz-in-ROS program.<br>1.2 Objective<br>1. To develop a visual servoing controller for robotic arms using RViz in ROS, suitable for industrial robots such as ABB&#8217;s.<br>2. To develop a visual servoing controller that utilizes computer vision techniques to detect and track a moving target in real-time.<br>3. To evaluate the performance of the visual servoing controller through experiments in a simulated environment and explore its potential applications in ROS Noetic.<\/p>\n\n\n\n<p>1.3 Scope of the project<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>To develop and test a visual servoing controller for a robotic arm using RViz with ROS Noetic in Ubuntu 20.04.<\/li>\n\n\n\n<li>The controller will be designed to track a moving target in real-time using a RealSense D435i depth camera.<\/li>\n\n\n\n<li>The project will focus on evaluating ArUco marker detection performance in a simulated environment and exploring its potential applications in ROS Noetic.<\/li>\n<\/ol>\n\n\n\n<p>1.4 Assigned duties<br>Mr. Tathipan Chaiwattanapan<br>\u2022 Configure and transfer all data to the IRC5 controller.<br>\u2022 Set up all the IP networking for communication between the ABB IRB120 robot and ROS.<br>\u2022 Test the robot simulation programs in Gazebo and RobotStudio and on the real IRB120.<br>Mr. Atthaphan Paksakunnee:<br>\u2022 Make ArUco markers using the RealSense D435i depth camera.<br>\u2022 Create a camera axis attached to the robot and retrieve the position values (XYZ) of the detected marker.<br>\u2022 Combine the robot code and camera code to work together.<\/p>\n\n\n\n<p>1.5 Research Schedule and Detailed Activity<br>Table 1 shows the workflow process for each month, from January (month 1) to April (month 4):<br>Find information about a program that can create a visual servoing controller for a robot arm.<br>Find and install the necessary study materials on Ubuntu and ROS systems.<br>Begin learning and experimenting with the Dynamixel robotic arm and programming RViz to control its movement.<br>Study and experiment with both the NUBWO camera system with ROS, as well as the RealSense depth camera D435i system.<br>Learn and experiment with the AUBO robot and ROS.<br>Shift focus to studying the ABB IRB120 robot with ROS.<br>Learn how to connect the IP address from the IRC5 controller to Ubuntu.<br>Successfully detect ArUco markers using the RealSense depth camera D435i.<br>Create a camera axis attached to the robot body in the RViz program and read the position (XYZ) of the detected marker.<br>Use RViz to control a real ABB IRB120 robot through ROS.<br>Combine the robot code and camera code to create a visual servoing controller for the robot arm.<\/p>\n\n\n\n<p>1.6 Expected Benefits<\/p>\n\n\n\n<ol class=\"wp-block-list\">\n<li>The visual servoing controller developed using ROS and tested with ABB robots is expected to enhance robotic arm movements in diverse industries.<\/li>\n\n\n\n<li>Real-time tracking using a depth camera for the visual servoing controller helps robots respond efficiently to environment changes, improving performance and productivity.<\/li>
\n\n\n\n<li>Local IP address connectivity enables integration with existing robot systems and devices.<\/li>\n<\/ol>\n\n\n\n<p>CHAPTER 2<br>Theories<br>Chapter 2 covers the theory behind industrial robots and the software used in this project, such as Ubuntu, ROS, and the depth camera.<\/p>\n\n\n\n<p>2.1 Ubuntu<\/p>\n\n\n\n<p>Figure 1 Ubuntu logo<br>Ubuntu is a widely used open-source operating system that is based on the Linux kernel. It was initially released in 2004 and has since gained popularity among users for its ease of use, stability, and security. The name &#8220;Ubuntu&#8221; is derived from a South African philosophy that emphasizes the interconnectedness of all things. One of the key features of Ubuntu is its package management system, which allows users to easily install, update, and remove software from their systems. Ubuntu also includes a graphical user interface (GUI) that is user-friendly and customizable.<\/p>\n\n\n\n<p>Figure 2 Ubuntu website<br>Ubuntu is popular among developers, system administrators, and other technical users due to its command-line interface (CLI) and support for a wide range of programming languages and tools. Additionally, Ubuntu is used in many enterprise environments due to its stability and security features, including mandatory access control and full disk encryption. Ubuntu is constantly updated and improved by a large community of developers and users, and new versions are released every six months. These updates often include new features, security patches, and bug fixes, ensuring that Ubuntu remains a reliable and secure operating system for its users.<br>In conclusion, Ubuntu is a powerful and versatile operating system that is widely used in a variety of settings. Its ease of use, stability, and security make it an excellent choice for both personal and professional use, and its open-source nature ensures that it will continue to be updated and improved by its community of users and developers.<\/p>\n\n\n\n<p>2.1.1 Ubuntu 20.04 version<\/p>\n\n\n\n<p>Figure 3 Ubuntu 20.04 version<br>Ubuntu 20.04 is a Linux distribution that was released on April 23, 2020. It is a long-term support (LTS) version of the Ubuntu operating system and is expected to receive support and updates until 2025. Ubuntu is one of the most popular Linux distributions, known for its ease of use, stability, and community support.<\/p>\n\n\n\n<p>Finally, Ubuntu 20.04 includes many new features and improvements for developers. The OS ships with recent versions of popular programming languages, such as Python 3.8 and PHP 7.4. Ubuntu 20.04 also includes a built-in toolchain for developing and debugging applications for the Snapcraft package format. This makes it easier for developers to create and distribute their applications on Ubuntu.<\/p>\n\n\n\n<p>Figure 4 Ubuntu 20.04.6 LTS<br>In conclusion, Ubuntu 20.04 is a significant update to the popular Linux distribution, offering improved performance, stability, and features. Its updated user interface, new kernel, and updated applications make it an excellent choice for both desktop and server use. Additionally, the OS provides a great environment for developers with the latest tools and features needed for creating and distributing applications.<\/p>\n\n\n\n<p>2.2 ROS<\/p>\n\n\n\n<p>Figure 5 ROS logo<br>ROS, which stands for Robot Operating System, is an open-source software framework for building robotic systems. 
It was first developed by Willow Garage in 2007 and has since become a standard for building robotic systems across a wide range of applications, including autonomous vehicles, drones, industrial automation, and more. At its core, ROS is a middleware that provides a set of libraries and tools that allow robots to communicate with each other and with their environments. It provides a set of standard interfaces for sensors, actuators, and other hardware components, as well as a set of tools for processing and analyzing sensor data, controlling robot motion, and managing system-level tasks.<\/p>\n\n\n\n<p>Figure 6 ROS website<br>One of the key benefits of ROS is its modular architecture, which allows developers to build complex robotic systems by combining and reusing existing components. This modularity makes it easy to build and test individual components in isolation, simplifies system integration, and allows for easy sharing and collaboration among developers. ROS also includes a powerful visualization system that allows developers to visualize and interact with the data generated by their robots in real-time. This includes 2D and 3D visualization tools, as well as tools for debugging and tuning robot behavior.<\/p>\n\n\n\n<p>Figure 7 ROS Community<br>Another key feature of ROS is its support for distributed computing. ROS provides a set of tools for running robotic systems across multiple computers, allowing developers to take advantage of the processing power of multiple machines and distribute the workload of complex applications. Finally, ROS has a large and active community of developers and users, who contribute to the development and improvement of the framework and share their knowledge and expertise with others. This community provides a wealth of resources, including documentation, tutorials, and support forums, making it easier for developers to get started with ROS and to overcome any challenges they may encounter.<\/p>\n\n\n\n<p>2.2.1 ROS Noetic Ninjemys<br>ROS Noetic Ninjemys is the latest stable version of the Robot Operating System (ROS) framework, released in May 2020, and the final ROS 1 distribution. It is the successor to ROS Melodic Morenia; in keeping with ROS&#8217;s turtle-themed release names, &#8220;Ninjemys&#8221; refers to an extinct genus of giant horned turtle. One of the most significant updates in ROS Noetic is its support for Python 3. While previous versions of ROS were based on Python 2, ROS Noetic is built using Python 3. This update provides improved performance and support for modern Python libraries, as well as the latest features and tools provided by the Python language.<\/p>\n\n\n\n<p>Figure 8 ROS Noetic Ninjemys<br>Another major update in ROS Noetic is its support for the latest versions of the Gazebo simulator and the MoveIt! motion planning framework. These updates provide developers with a more powerful and flexible simulation environment for testing and developing robotic systems, as well as improved motion planning and control capabilities. ROS Noetic also includes updates and improvements to several core packages, including the rosbag package for recording and playing back ROS messages, the RViz visualization tool, and the navigation stack for autonomous navigation. 
These updates provide improved performance, stability, and features to these core components, making it easier for developers to build complex robotic systems.<\/p>\n\n\n\n<p>Figure 9 ROS Noetic installation<br>Additionally, ROS Noetic includes a set of new packages and libraries that provide support for a range of new hardware and sensors, such as the Intel RealSense camera and the Velodyne LiDAR sensor. These new packages allow developers to take advantage of the latest hardware and sensors in their robotic systems, opening up new possibilities for applications such as autonomous driving and robotics research. Overall, ROS Noetic Ninjemys offers several updates and improvements to the ROS framework, including support for Python 3, updated versions of core packages and tools, and new packages for hardware and sensors. These updates make ROS Noetic a powerful and flexible tool for building complex robotic systems, with a modular architecture, powerful visualization tools, support for distributed computing, and a large and active community of developers and users.<\/p>\n\n\n\n<p>2.2.2 Basic ROS<br>A ROS node is a process that performs a specific task and communicates with other nodes via messages. Nodes are the building blocks of ROS applications, and they can be written in various programming languages such as C++, Python, and Java. Nodes can publish messages to topics or subscribe to topics to receive messages from other nodes. They can also provide and use services to perform specific tasks. Nodes in ROS can run on a single machine or be distributed across multiple machines, providing a flexible and scalable framework for building robotic systems.<br>A ROS topic is a named bus over which nodes exchange messages. Topics enable communication between nodes in a publish-subscribe architecture, where nodes can publish messages to a topic or subscribe to receive messages from a topic. Messages can be of any type, such as sensor data, control commands, or status updates. Topics can also be visualized using tools such as rostopic and rqt_graph, making it easier to understand the communication between nodes in a ROS system.<br>The ROS core system provides a set of tools and libraries that enable communication between nodes, as well as support for common functionality such as message passing, visualization, and hardware drivers. The core system includes components such as roscore, roslaunch, rostopic, and RViz, which are essential for building ROS applications. The core system is designed to be modular and extensible, allowing developers to add their own packages and libraries to build custom functionality on top of the core system.<br>A ROS catkin workspace is a directory where ROS packages are built and installed. Catkin is the build system used in ROS, and it provides a unified way to build, test, and install ROS packages. A catkin workspace contains a src directory where the source code for ROS packages is stored, as well as a devel directory where the built packages are installed. Catkin workspaces enable developers to manage dependencies between packages and build custom ROS distributions. They also provide tools for managing and releasing packages, such as catkin_make, catkin_tools, and bloom. Overall, catkin workspaces are an essential tool for building and managing ROS applications.<\/p>
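\n\n\n\n<p>As a minimal illustration of the publish-subscribe pattern described above, the sketch below runs a single rospy node that both publishes to and subscribes from one topic. It is a hedged example, not taken from this project&#8217;s code; the node and topic names are illustrative only:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>#!\/usr\/bin\/env python3\n# Minimal ROS node that publishes and subscribes on one topic (illustrative names).\nimport rospy\nfrom std_msgs.msg import String\n\ndef on_message(msg):\n    # Called once per message received on the topic.\n    rospy.loginfo('heard: %s', msg.data)\n\ndef main():\n    rospy.init_node('demo_node')  # register this process with the ROS master\n    pub = rospy.Publisher('chatter', String, queue_size=10)\n    rospy.Subscriber('chatter', String, on_message)\n    rate = rospy.Rate(1)  # 1 Hz\n    while not rospy.is_shutdown():\n        pub.publish(String(data='hello'))  # publish to the named topic\n        rate.sleep()\n\nif __name__ == '__main__':\n    main()<\/code><\/pre>\n\n\n\n<p>Running rostopic echo \/chatter or rqt_graph alongside this node makes the message flow between publisher and subscriber visible.<\/p>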
\n\n\n\n<p>2.2.3 MoveIt \/RViz<br>In ROS, MoveIt! is a powerful motion planning framework that provides tools for planning, executing, and monitoring robotic motion. It is designed to work with a wide range of robot platforms, making it a popular choice for building complex robotic systems. One of the key features of MoveIt! is its integration with RViz, a 3D visualization tool for ROS. RViz provides a graphical user interface for visualizing and interacting with robot models, planning trajectories, and monitoring the execution of motion plans. MoveIt! and RViz work together to provide a powerful and flexible framework for motion planning and control in ROS, enabling developers to build sophisticated robotic applications with ease.<\/p>\n\n\n\n<p>Figure 10 MoveIt logo<br>Motion planning refers to the process of generating a sequence of movements for a robot to achieve a specific task or goal. This can include tasks such as navigating through an environment, manipulating objects, or performing a series of coordinated motions. ROS provides several motion planning frameworks, including MoveIt!, which provides a set of tools for planning, executing, and monitoring robotic motion. MoveIt! uses a variety of motion planning algorithms to generate feasible and optimized motion plans for robot systems. These algorithms include geometric planning, sampling-based planning, and optimization-based planning. ROS also provides tools for simulating robot motion, enabling developers to test and refine their motion plans before deploying them on a physical robot. Overall, ROS provides a powerful and flexible framework for motion planning in robotic systems, enabling developers to build sophisticated applications with ease.<\/p>\n\n\n\n<p>Figure 11 RViz interface<br>MoveIt also relies on TF (short for Transform), a ROS library for managing coordinate frame transforms in 3D space. TF provides a way to define a hierarchy of coordinate frames that represent the position and orientation of objects in a robot system. These frames can represent physical components of the robot, such as joints and sensors, as well as objects in the robot&#8217;s environment, such as tables and walls. TF provides tools for transforming points and vectors between coordinate frames, allowing developers to easily track the position and orientation of objects in a robot system. TF also provides tools for interpolating between frames, smoothing noisy sensor data, and managing multiple frames of reference. Overall, TF is an essential tool for managing coordinate frames in ROS, enabling developers to build complex robotic systems with ease.<\/p>
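\n\n\n\n<p>As a small, hedged illustration of how a TF lookup is used in practice (the frame names base_link and camera_link are placeholders for whatever the robot&#8217;s URDF actually defines), the sketch below queries the transform between two frames with tf2:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>#!\/usr\/bin\/env python3\n# Look up the pose of one frame relative to another via tf2 (frame names are placeholders).\nimport rospy\nimport tf2_ros\n\nrospy.init_node('tf_lookup_demo')\ntf_buffer = tf2_ros.Buffer()\nlistener = tf2_ros.TransformListener(tf_buffer)  # fills the buffer from the \/tf topics\n\nrospy.sleep(1.0)  # give the listener time to receive transforms\nt = tf_buffer.lookup_transform('base_link', 'camera_link',\n                               rospy.Time(0), rospy.Duration(2.0))\np = t.transform.translation\nrospy.loginfo('camera at x=%.3f y=%.3f z=%.3f relative to base', p.x, p.y, p.z)<\/code><\/pre>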
\n\n\n\n<p>Figure 12 Transform tree<\/p>\n\n\n\n<p>2.3 RealSense D435i depth camera<\/p>\n\n\n\n<p>Figure 13 RealSense D435i depth camera<br>The RealSense D435i is a depth camera developed by Intel. It is part of the Intel RealSense product line, which includes various cameras and depth sensing technologies.<br>The D435i is an upgraded version of the D435 camera and incorporates an additional inertial measurement unit (IMU). The IMU enables the camera to capture not only depth information but also data related to motion and orientation. This combination of depth sensing and inertial data allows for more advanced applications in robotics, augmented reality, virtual reality, and other fields that require precise spatial understanding.<\/p>\n\n\n\n<p>Figure 14 RealSense D435i depth camera structure<br>The D435i captures depth using active infrared stereo. An infrared projector casts a texture pattern onto the scene, and two infrared imagers view the scene from slightly different positions; by matching the two views and measuring the disparity of each point, the camera can calculate the depth values of different points in the scene.<br>The D435i also includes a separate RGB sensor for capturing color information. These RGB images can be combined with the depth data to create a complete 3D representation of the environment.<\/p>\n\n\n\n<p>2.3.1 RealSense SDK 2.0<br>The RealSense SDK 2.0 (Software Development Kit) is a software package developed by Intel Corporation. It is designed to enable developers to incorporate Intel RealSense technology into their applications and projects. Intel RealSense technology includes a combination of depth sensing, motion tracking, and image processing capabilities.<br>The RealSense SDK 2.0 is a cross-platform library for Intel\u00ae RealSense&#x2122; depth cameras (D400 &amp; L500 series and the SR300) and the T265 tracking camera.<br>The RealSense SDK 2.0 provides APIs (Application Programming Interfaces) and tools for accessing and utilizing the various features of Intel RealSense depth cameras. These depth cameras capture depth information along with color and infrared data, allowing for the creation of immersive augmented reality experiences, gesture recognition, 3D scanning, facial analysis, and more.<br>The RealSense SDK 2.0 supports multiple programming languages, including C++, C#, Python, and Java, making it accessible to a wide range of developers. It provides a range of functionalities and features, such as depth stream access, hand and finger tracking, face tracking and recognition, object scanning, and background segmentation.<\/p>\n\n\n\n<p>2.4 Aruco marker<br>Aruco markers are a type of fiducial marker used in computer vision applications, particularly in augmented reality (AR) and robotics. They are square or rectangular markers that consist of a black and white grid pattern. The patterns are designed to be easily detectable and identifiable by computer vision algorithms.<br>The name &#8220;ArUco&#8221; comes from the marker library of the same name, developed for Augmented Reality applications at the University of Cordoba. Aruco markers are widely used for camera pose estimation, object tracking, and localization in AR applications. They provide a way for a computer vision system to recognize and track the position and orientation of the marker in real-time.<br>This is a random selection of Aruco markers, the kind of markers which we shall endeavor to detect in images:<\/p>\n\n\n\n<p>Figure 15 Example of Aruco markers<br>An Aruco marker refers to a synthetic square marker characterized by a prominent black border surrounding an inner binary matrix, which serves as its identifier (id). The black border serves the purpose of enabling swift detection within an image, while the binary encoding enables identification and facilitates the implementation of error detection and correction techniques. The dimensions of the marker correspond to the size of the internal matrix. For example, a marker size of 4&#215;4 consists of 16 bits.<br>It is important to highlight that a marker can be detected in various orientations within the environment. However, the detection process must possess the ability to determine the marker&#8217;s original rotation, ensuring the unambiguous identification of each corner. This rotation determination is also achieved based on the marker&#8217;s binary codification.<\/p>\n\n\n\n<p>In a specific application, a dictionary of markers comprises the collection of markers that are utilized. Essentially, it is a straightforward compilation of the binary codifications associated with each individual marker in the set.<br>The main properties of a dictionary are the dictionary size and the marker size.<br>\u2022 The dictionary size is the number of markers that compose the dictionary.<br>\u2022 The marker size is the size of those markers (the number of bits).<br>The Aruco module includes some predefined dictionaries covering a range of different dictionary sizes and marker sizes.<br>It may be initially assumed that the marker identifier corresponds to a decimal number obtained by converting the binary codification. However, this approach becomes impractical for markers of larger sizes, where the number of bits becomes excessively large and managing such huge numbers becomes cumbersome and inefficient. Instead, the marker identifier is simply defined as the index of the marker within the dictionary to which it belongs.<\/p>
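\n\n\n\n<p>To make the dictionary idea concrete, and as a preview of the detection and pose-estimation steps described in the next two subsections, the sketch below loads a predefined dictionary, draws one marker, then detects markers in an image and estimates their pose. It assumes the classic cv2.aruco API from opencv-contrib-python (the API shipped with OpenCV versions contemporary with Ubuntu 20.04); the camera matrix, distortion coefficients, file names, and marker length are placeholder values:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># Minimal ArUco sketch: generate one marker, then detect markers and estimate pose.\nimport cv2\nimport numpy as np\n\ndictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)  # 50 markers, 4x4 bits each\nmarker = cv2.aruco.drawMarker(dictionary, 23, 200)  # marker id 23 as a 200x200 px image\ncv2.imwrite('marker23.png', marker)\n\n# Placeholder calibration; a real camera matrix and distortion coefficients are required.\ncamera_matrix = np.array([[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]])\ndist_coeffs = np.zeros(5)\n\nimage = cv2.imread('scene.png')\ngray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)\nparams = cv2.aruco.DetectorParameters_create()\ncorners, ids, rejected = cv2.aruco.detectMarkers(gray, dictionary, parameters=params)\n\nif ids is not None:\n    # 0.05 m marker side length; rvecs\/tvecs give each marker's pose in the camera frame.\n    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(\n        corners, 0.05, camera_matrix, dist_coeffs)\n    for i, marker_id in enumerate(ids.flatten()):\n        print('marker', marker_id, 'at', tvecs[i].ravel())<\/code><\/pre>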
\n\n\n\n<p>2.4.1 Marker Detection<br>Given an image containing Aruco markers, the detection process has to return a list of detected markers. Each detected marker includes:<br>\u2022 The position in the image<br>\u2022 The id of the marker<br>The process of detecting markers involves two primary steps:<br>I. Detection of marker candidates: In this step, the image is analyzed to identify square-shaped objects that have the potential to be markers. The process begins with adaptive thresholding to segment the markers, followed by the extraction of contours from the thresholded image. Contours that are not convex or do not approximate a square shape are discarded. Additional filtering techniques are applied, such as removing contours that are too small or too large, or eliminating contours that are in close proximity to each other.<br>II. Verification of marker candidates: After the initial detection of marker candidates, it is necessary to determine if they are indeed markers by analyzing their inner codification. This step involves extracting the marker bits for each candidate. To achieve this, a perspective transformation is applied to bring the marker into its canonical form. Subsequently, the canonical image is thresholded using Otsu&#8217;s method to separate the black and white bits. The image is divided into different cells based on the marker size and border size. The number of black or white pixels in each cell is then counted to determine whether it represents a white or black bit. Finally, the bits are analyzed to ascertain whether the marker belongs to the specific dictionary. Error correction techniques are employed as necessary to enhance the accuracy of identification.<\/p>\n\n\n\n<p>These are the detected markers (in green). Note that some markers are rotated. The small red square indicates the marker\u2019s top left corner:<\/p>\n\n\n\n<p>Figure 16 Aruco detection<br>2.4.2 Pose Estimation<br>To accomplish camera pose estimation, it is necessary to know the camera&#8217;s calibration parameters. These parameters consist of the camera matrix and distortion coefficients.<br>\u2022 Camera Matrix: The camera matrix contains intrinsic parameters that define the internal properties of the camera. It includes focal length (expressed in pixels), principal point coordinates (representing the optical center of the camera), and skew coefficient (typically assumed to be zero). 
The camera matrix encapsulates the transformation from 3D world coordinates to 2D image coordinates.<br>\u2022 Distortion Coefficients: Distortion coefficients account for lens distortions that occur in real-world cameras. These distortions can be classified into radial distortion and tangential distortion. Radial distortion refers to the curvature of straight lines near the image corners, while tangential distortion accounts for the slight tilt or shift of the lens relative to the image plane.<\/p>\n\n\n\n<p>The function assumes a marker coordinate system where the origin is positioned either in the center (by default) or at the top left corner of the marker. In this coordinate system, the Z-axis extends outward from the marker surface. The axis-color correspondences follow the convention of X-axis represented by red, Y-axis represented by green, and Z-axis represented by blue.<br>It&#8217;s important to note that the image provided demonstrates the axis directions for rotated markers. These directions indicate how the X, Y, and Z axes align with the marker&#8217;s orientation in 3D space.<\/p>\n\n\n\n<p>Figure 17 Aruco pose estimate<br>2.5 Robot industrial<br>The industrial robots defined by the ISO standard are programmable manipulators with three or more axes that are automatically controlled, reprogrammable, and multipurpose. They are used to automate processes in the industrial sector and can operate within collaborative environments with humans or within a security fence. Industrial robots typically have between three and seven axes and different degrees of freedom, making them suitable for various applications, including assembly lines and production lines. Overall the use of industrial robots in the manufacturing industry has become increasingly prevalent and will likely continue to grow as technology advances.<\/p>\n\n\n\n<p>Figure 18 robot industrial<br>However, there are also potential drawbacks to consider when using AUBO cobots. One is that they may not be suitable for all types of tasks, particularly those that require a high degree of precision or dexterity. Additionally, the upfront cost of acquiring and integrating AUBO cobots into existing production processes can be a significant investment for companies. Nevertheless, the potential benefits of using AUBO cobots suggest that they are a promising technology for the future of industrial automation.<br>2.5.1 ABB<br>ABB robots are industrial robots manufactured by ABB for various applications such as welding, painting, material handling, and assembly. This report will explore the benefits and potential drawbacks of ABB robots, including improved production efficiency and reduced labor costs, and the potential for job displacement. The theories discussed will highlight the impact of ABB robots on industrial automation and the future of work.<\/p>\n\n\n\n<p>Figure 19 ABB logo<br>The ABB IRB120 robot is a compact and versatile industrial robot designed for various applications, including assembly, material handling, and machine tending. One theory regarding the robot is that it can improve production efficiency in small and medium-sized enterprises (SMEs). Due to its compact size and flexible programming, the IRB120 robot can be easily integrated into existing production processes in SMEs, increasing productivity and reducing lead times. 
This can be particularly beneficial for SMEs that are looking to compete with larger companies by adopting advanced manufacturing technologies.<\/p>\n\n\n\n<p>Figure 20 ABB IRB120<br>Despite the many benefits of using the ABB IRB120 robot, there are potential drawbacks to consider. One concern is the high initial investment cost associated with the robot, as well as the need for specialized training and programming. Additionally, the implementation of the robot may require changes to the existing production processes, which can be a complex and time-consuming process.<br>The maximum working range of the ABB IRB 120 can be expressed in centimeters as follows:<br>\u2022 Base rotation: \u00b1180 degrees<br>The maximum distance the robot can reach in this axis is dependent on the location of the robot&#8217;s base and the length of its arm.<br>\u2022 Shoulder rotation: +155 to -90 degrees<br>The maximum vertical reach of the robot in this axis is approximately 42.5 cm.<br>\u2022 Elbow rotation: +45 to -90 degrees<br>The maximum horizontal reach of the robot in this axis is approximately 40.8 cm.<br>\u2022 Wrist rotation: \u00b1270 degrees<br>The maximum horizontal reach of the robot in this axis is approximately 31.7 cm.<br>\u2022 Wrist bend: \u00b1135 degrees<br>The maximum vertical reach of the robot in this axis is approximately 27.5 cm.<br>\u2022 Gripper rotation: \u00b1360 degrees<br>The maximum distance the robot can reach in this axis is dependent on the length of the gripper.<br>To convert the maximum working range from degrees to centimeters, we need to know the length of the robot&#8217;s arm. Assuming the arm length is approximately 51.5 cm, we can estimate the maximum working range of each axis in centimeters as follows:<br>\u2022 Base rotation: \u00b151.5 cm (based on the length of the arm)<br>\u2022 Shoulder rotation: 0 to 42.5 cm (vertical reach)<br>\u2022 Elbow rotation: 0 to 40.8 cm (horizontal reach)<br>\u2022 Wrist rotation: \u00b131.7 cm (horizontal reach)<br>\u2022 Wrist bend: 0 to 27.5 cm (vertical reach)<br>\u2022 Gripper rotation: Dependent on the length of the gripper<br>It&#8217;s important to note that the actual maximum working range of the robot may be limited by other factors such as the size and shape of the objects being handled, the orientation of the gripper, and any obstacles in the environment. Therefore, it&#8217;s essential to perform a thorough analysis of the robot&#8217;s reach and workspace for each specific application to ensure that the robot can perform the required tasks accurately and safely.<br>2.6 Robotic eye-in-hand<br>Robotic eye-in-hand visual servoing is an important area of research in robotics and automation. It is a closed-loop control technique that uses visual feedback to adjust the robot&#8217;s motion. This technique involves using a camera mounted on the robot&#8217;s end effector to capture images of the object or scene being manipulated, and then using computer vision algorithms to extract information from the images to update the robot&#8217;s position and orientation in real-time. In this report, we will discuss the theoretical foundations and key concepts of robotic eye-in-hand visual servoing.<\/p>\n\n\n\n<p>Figure 21 Robotic eye-in-hand<br>Robotic eye-in-hand visual servoing is a powerful technique that allows robots to adapt to changes in the environment and perform complex tasks with high accuracy and efficiency. 
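<\/p>\n\n\n\n<p>At the heart of most such systems is the classical image-based visual servoing control law from the standard literature (e.g. Chaumette and Hutchinson&#8217;s tutorials; stated here as background, not as this project&#8217;s own derivation). With s the measured image features of the target, s* their desired values, and L_e the interaction matrix relating feature velocity to camera velocity, a proportional controller commands the camera velocity:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>% classical IBVS control law (LaTeX notation)\ne(t) = s(t) - s^{*}\n\\mathbf{v}_c = -\\lambda \\, \\hat{\\mathbf{L}}_e^{+} \\, e(t)\n% \\lambda is a positive gain; \\hat{L}_e^{+} is the Moore-Penrose pseudo-inverse of the interaction matrix<\/code><\/pre>\n\n\n\n<p>Driving e to zero with this law moves the camera, and hence the eye-in-hand arm, until the target appears at the desired location in the image.<\/p>\n\n\n\n<p>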
The theoretical foundations and key concepts of visual servoing, eye-in-hand visual servoing, and robotic eye-in-hand visual servoing are essential to understanding this technique and its potential applications in robotics and automation. Further research in this area is necessary to develop more advanced algorithms and techniques for robotic eye-in-hand visual servoing.<\/p>\n\n\n\n<p><\/p>\n\n\n\n<p>CHAPTER 3<br>Materials and Methods<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"507\" height=\"380\" src=\"https:\/\/i0.wp.com\/gulfthai.com\/wp-content\/uploads\/2023\/06\/image-10.jpg?resize=507%2C380&#038;ssl=1\" alt=\"\" class=\"wp-image-8483\"\/><\/figure><\/div>\n\n\n<p>This chapter discusses our software and the real robot hardware. The software consists of the ROS program and the depth camera.<br>3.1 VirtualBox<br>Before installing Ubuntu 20.04 for use with ROS Noetic, we have to install Oracle VM VirtualBox.<\/p>\n\n\n\n<p>Figure 22 VirtualBox logo<br>VirtualBox is a free and open-source virtualization software developed by Oracle. It allows you to create and run virtual machines on your computer, which are essentially emulated computer systems that can run their own operating system and applications.<\/p>\n\n\n\n<p>Figure 23 VirtualBox 7.0.8 platform packages download Windows hosts<\/p>\n\n\n\n<p>3.1.1 Install and setup Ubuntu<br>Open the Ubuntu 20.04 LTS releases page at https:\/\/releases.ubuntu.com\/focal\/ and download the 64-bit PC (AMD64) desktop image, which includes the full desktop environment, so that you don&#8217;t have to download the server image separately.<br>Open Oracle VM VirtualBox Manager and click New in the upper-left corner of the VirtualBox window. Doing so opens a pop-up menu.<\/p>\n\n\n\n<p>Figure 24 Oracle VM VirtualBox Manager<br>Enter a name for the virtual machine (e.g., Ubuntu) in the &#8220;Name&#8221; text field near the top of the pop-up menu, select Linux as the &#8220;Type&#8221; and Ubuntu as the &#8220;Version&#8221; (matching the Ubuntu image downloaded), and click Next.<\/p>\n\n\n\n<p>Figure 25 Create VM<\/p>\n\n\n\n<p>Choose the base memory and processor count based on the host system&#8217;s RAM and CPU, and click Next.<\/p>\n\n\n\n<p>Figure 26 Choose the base memory VM<br>Choose the Virtual Hard Disk Size and click Next.<\/p>\n\n\n\n<p>Figure 27 Virtual Hard Disk Size<\/p>\n\n\n\n<p>Finally, VirtualBox shows a summary of the setup; click Finish.<\/p>\n\n\n\n<p>Figure 28 VM summary<\/p>\n\n\n\n<p>Click Settings in the upper corner of the VirtualBox window.<\/p>\n\n\n\n<p>Figure 29 Setting VM<\/p>\n\n\n\n<p>Go to Storage, click Add hard disk, select the downloaded Ubuntu file, then click OK.<\/p>\n\n\n\n<p>Figure 30 Storage Ubuntu<\/p>\n\n\n\n<p>Click Start in the upper corner of the VirtualBox window to run the VM and continue the Ubuntu setup.<\/p>\n\n\n\n<p>Figure 31 Click Start Ubuntu<br>3.2 ROS<br>This Ubuntu 20.04 version supports only ROS Noetic. 
Go to http:\/\/wiki.ros.org\/noetic\/Installation\/Ubuntu and follow the installation steps.<\/p>\n\n\n\n<p>Figure 32 Ubuntu 20.04 Linux<br>3.2.1 Install and setup ROS<br>Configure Ubuntu repositories<br>Configure the Ubuntu repositories to allow &#8220;restricted,&#8221; &#8220;universe,&#8221; and &#8220;multiverse.&#8221; You can follow the Ubuntu guide https:\/\/help.ubuntu.com\/community\/Repositories\/Ubuntu for instructions on doing this.<br>Set up sources.list<br>\u2022 Set up the computer to accept software from packages.ros.org. Open a terminal on Ubuntu and type:<br>sudo sh -c 'echo \"deb http:\/\/packages.ros.org\/ros\/ubuntu $(lsb_release -sc) main\" &gt; \/etc\/apt\/sources.list.d\/ros-latest.list'<\/p>\n\n\n\n<p>\u2022 Set up keys<br>sudo apt install curl # if you haven&#8217;t already installed curl<br>curl -s https:\/\/raw.githubusercontent.com\/ros\/rosdistro\/master\/ros.asc | sudo apt-key add -<\/p>\n\n\n\n<p>Installation<br>\u2022 First, make sure the Debian package index is up to date:<br>sudo apt update<\/p>\n\n\n\n<p>Now pick how much of ROS to install.<br>\u2022 Desktop-Full Install (Recommended): Everything in Desktop plus 2D\/3D simulators and 2D\/3D perception packages<br>sudo apt install ros-noetic-desktop-full<\/p>\n\n\n\n<p>Environment setup<br>You must source this script in every bash terminal in which you use ROS:<br>source \/opt\/ros\/noetic\/setup.bash<br>It can be convenient to automatically source this script every time a new shell is launched. These commands will do that.<br>\u2022 Bash<br>echo \"source \/opt\/ros\/noetic\/setup.bash\" &gt;&gt; ~\/.bashrc<br>source ~\/.bashrc<\/p>\n\n\n\n<p>Dependencies for building packages<br>Up to now you have installed what you need to run the core ROS packages. To create and manage your own ROS workspaces, there are various tools and requirements that are distributed separately.<br>\u2022 To install these tools and other dependencies for building ROS packages, run:<br>sudo apt install python3-rosdep python3-rosinstall python3-rosinstall-generator python3-wstool build-essential<\/p>\n\n\n\n<p>Initialize rosdep<br>Before you can use many ROS tools, you need to initialize rosdep. rosdep enables you to easily install system dependencies for the source you want to compile and is required to run some core components in ROS.<br>\u2022 If rosdep is not yet installed, do so as follows:<br>sudo apt install python3-rosdep<\/p>\n\n\n\n<p>\u2022 With the following, you can initialize rosdep:<br>sudo rosdep init<br>rosdep update<\/p>\n\n\n\n<p>Check the ROS version.<br>\u2022 The rosversion command prints version information for ROS stacks and can also print the name of the active ROS distribution. Open a terminal and type:<br>rosversion -d<\/p>\n\n\n\n<p>Figure 33 check rosversion<\/p>\n\n\n\n<p>3.3 ROS with ABB<br>There are several repositories with experimental packages for ABB manipulators in ROS-Industrial. Specifically, for this project, the following has been used: https:\/\/github.com\/ros-industrial\/abb_experimental , which is based on the official ROS page: http:\/\/wiki.ros.org\/abb_experimental<br>Below are the steps for creating the workspace, cloning the said repository, installing all the necessary dependencies, and finally building the workspace.<br>3.3.1 Install and setup Package Summary<br>Creating a workspace for catkin<br>The catkin_make command is a convenience tool for working with catkin workspaces. 
Running it the first time in your workspace will create a CMakeLists.txt link in your &#8216;src&#8217; folder.<br>$ source \/opt\/ros\/noetic\/setup.bash<br>$ mkdir -p ~\/catkin_ws\/src<br>$ cd ~\/catkin_ws\/<br>$ catkin_make # or catkin build<\/p>\n\n\n\n<p>Figure 34 catkin workspace<\/p>\n\n\n\n<p>Industrial_core<br>Go to http:\/\/wiki.ros.org\/industrial_core . The ROS-Industrial core stack contains packages and libraries for supporting industrial systems. This stack is part of the ROS-Industrial program. It currently contains core packages that provide nodes and libraries for communication with industrial robot controllers. It also includes utilities and tools that are useful for industrial robotics and automation applications.<\/p>\n\n\n\n<p>Figure 35 ROS-Industrial Overview<br>The ROS-Industrial distribution contains metapackages for several industrial vendors. More information can be found at http:\/\/wiki.ros.org\/Industrial\/supported_hardware .<br>\u2022 ABB<br>\u2022 Fanuc<br>\u2022 Kuka (under development, experimental)<br>\u2022 Motoman<br>\u2022 Robotiq<br>\u2022 Universal Robots<br>\u2022 Open a terminal and type this command to install the ROS-Industrial ROS package:<br>$ sudo apt install ros-noetic-industrial-core<br>$ rosdep update<br>$ catkin build<\/p>\n\n\n\n<p>Abb_driver<br>Go to http:\/\/wiki.ros.org\/abb_driver . This package is part of the ROS-Industrial program and contains nodes for interfacing with ABB industrial robot controllers. This repository contains a simple, RAPID-based ROS driver for ABB industrial robots connected to IRC5 controllers. The driver is largely manipulator agnostic, and is expected to work with any ABB manipulator compatible with an IRC5 controller.<br>\u2022 Open a terminal and type this command to install the abb_driver ROS package:<br>$ sudo apt install ros-noetic-abb-driver<br>$ rosdep update<\/p>\n\n\n\n<p>\u2022 To build the abb_driver package from source, see https:\/\/github.com\/ros-industrial\/abb_driver :<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># change to the root of the Catkin workspace\n$ cd $HOME\/catkin_ws\n$ git clone -b kinetic-devel https:\/\/github.com\/ros-industrial\/abb_driver.git src\/abb_driver\n# check for and install missing build dependencies;\n# first, update the local database\n$ rosdep update\n# be sure to change 'noetic' to whichever ROS version you are using\n$ rosdep install --from-paths src\/ --ignore-src --rosdistro noetic\n# build the workspace (using catkin_tools)\n$ catkin build\n# activate the workspace\n$ source $HOME\/catkin_ws\/devel\/setup.bash<\/code><\/pre>\n\n\n\n<p>ABB Experimental<br>Go to https:\/\/github.com\/ros-industrial\/abb_experimental . This repository is part of the ROS-Industrial program. It currently contains packages that provide nodes for communication with ABB industrial robot controllers, URDF models for supported manipulators, and associated MoveIt packages.<br>\u2022 It contains experimental packages that will be moved to the abb package once they&#8217;ve received sufficient testing and review. 
Open a terminal and type this command to install ABB Experimental:<br>$ sudo apt install ros-noetic-abb-experimental<br>$ rosdep update<\/p>\n\n\n\n<p>\u2022 To build all the ABB Experimental robot libraries from source:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code># change to the root of the Catkin workspace\n$ cd $HOME\/catkin_ws\n# retrieve the latest development version of the abb repository. If you'd rather\n# use the latest released version, replace 'noetic-devel' with 'noetic'\n$ git clone -b noetic-devel https:\/\/github.com\/ros-industrial\/abb.git src\/abb\n# retrieve the latest development version of abb_experimental\n$ git clone -b noetic-devel https:\/\/github.com\/ros-industrial\/abb_experimental.git src\/abb_experimental\n# check build dependencies. Note: this may install additional packages,\n# depending on the software installed on the machine\n$ rosdep update\n# be sure to change 'noetic' to whichever ROS release you are using\n$ rosdep install --from-paths src\/ --ignore-src --rosdistro noetic\n# build the workspace (using catkin_tools)\n$ catkin build\n# activate the workspace\n$ source $HOME\/catkin_ws\/devel\/setup.bash<\/code><\/pre>\n\n\n\n<p>Figure 36 ABB Experimental files<\/p>\n\n\n\n<p>3.3.2 Simulation ABB with RViz and Gazebo<br>This package contains the files required to simulate the ABB IRB 120 manipulator (and variants) in Gazebo.<\/p>\n\n\n\n<p>Figure 37 ABB IRB 120 manipulator files<br>To use MoveIt! with the Gazebo simulator, open a terminal and type the following commands.<br>\u2022 Bring the robot model into Gazebo and load the ros_control controllers:<br>roslaunch abb_irb120_gazebo irb120_3_58_gazebo.launch<\/p>\n\n\n\n<p>\u2022 In a new terminal, launch MoveIt! and ensure that it is configured to run alongside Gazebo:<br>roslaunch abb_irb120_moveit_config moveit_planning_execution_gazebo.launch<\/p>\n\n\n\n<p>Figure 38 ABB with RViz and Gazebo<br>To move the robot and check that the two tools are connected to each other, we position ourselves in RViz and look at the MotionPlanning window. In the first box, Context, we have the option to select the OMPL planner that we want to use. The Open Motion Planning Library is a powerful collection of state-of-the-art sampling-based motion planning algorithms and is the default planner in MoveIt. In our case, select RRTConnectkConfigDefault.<\/p>\n\n\n\n<p>Figure 39 OMPL RRTConnectkConfigDefault<br>By default, the robot is shown in its previously established initial position. We can move it manually by clicking on the Interactive Marker and dragging it with the mouse to the desired position. However, to use the MoveIt trajectory planner we must go to the Planning box, where we can send different positions to the robot. To do this, we position ourselves on the Query tab and select a goal, which can be either a valid or invalid random position generated by RViz, or a position that has been saved by default, such as the &#8220;home&#8221; position. 
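<\/p>\n\n\n\n<p>The same plan-and-execute cycle can also be driven programmatically through moveit_commander. The sketch below is a minimal, hedged example assuming the IRB120 MoveIt configuration above is running; the planning group name &#8216;manipulator&#8217; and the saved target &#8216;home&#8217; are assumptions, so check the SRDF of the MoveIt config:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>#!\/usr\/bin\/env python3\n# Plan and execute motions with moveit_commander (group\/target names are assumptions).\nimport sys\nimport rospy\nimport moveit_commander\n\nmoveit_commander.roscpp_initialize(sys.argv)\nrospy.init_node('irb120_plan_demo')\n\ngroup = moveit_commander.MoveGroupCommander('manipulator')\ngroup.set_named_target('home')  # a pose saved in the MoveIt config\ngroup.go(wait=True)  # plan and execute, blocking until the motion finishes\n\n# Alternatively, command explicit joint values (radians, one per joint):\njoints = group.get_current_joint_values()\njoints[0] = 0.5\ngroup.go(joints, wait=True)\ngroup.stop()  # ensure there is no residual movement<\/code><\/pre>\n\n\n\n<p>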
Back in RViz, clicking the Update box establishes the robot at the new position, and clicking the Plan and Execute button carries out the trajectory to that point in both RViz and Gazebo.<\/p>\n\n\n\n<p>Figure 40 Planning box<br>In the figures above, the robot is shown in the final position in both Gazebo and RViz.<\/p>\n\n\n\n<p>3.3.3 Simulation ABB with RViz and Robot Studio<br>Setting up RobotStudio for simulated ROS control. The abb_driver repository contains a simple, RAPID-based ROS driver for ABB industrial robots connected to IRC5 controllers. Go to https:\/\/github.com\/ros-industrial\/abb_driver (make sure to download and install it on Ubuntu, and also download it on Windows to set up RobotStudio).<br>First, open RobotStudio and select the IRB 120 robot.<\/p>\n\n\n\n<p>Figure 41 RobotStudio IRB120<\/p>\n\n\n\n<p>Then add a controller from the layout and click Options.<\/p>\n\n\n\n<p>Figure 42 Add controller Robotstudio<br>The following controller options are required:<br>\u2022 623-1: Multitasking<br>\u2022 672-1: Socket Messaging (in recent RobotWare versions, this option is included with 616-1: PC Interface)<br>The ABB ROS server code is written in RAPID, using a socket interface and multiple parallel tasks.<\/p>\n\n\n\n<p>Figure 43 Controller options<\/p>\n\n\n\n<p>Copy the files downloaded from https:\/\/github.com\/ros-industrial\/abb_driver to the virtual controller folder that RobotStudio created.<\/p>\n\n\n\n<p>Figure 44 Copy the ROS files to the RobotStudio folder<br>File Overview<br>\u2022 Shared by all tasks<br>o ROS_common.sys &#8212; Global variables and data types shared by all files<br>o ROS_socket.sys &#8212; Socket handling and simple_message implementation<br>o ROS_messages.sys &#8212; Implementation of specific message types<br>\u2022 Specific task modules<br>o ROS_stateServer.mod &#8212; Broadcasts joint position and state data<br>o ROS_motionServer.mod &#8212; Receives robot motion commands<br>o ROS_motion.mod &#8212; Issues motion commands to the robot<br>Then save and restart. Go to the Controller tab \u2192 RAPID and check the file overview.<\/p>\n\n\n\n<p>Socket-Server Tasks (GetSysInfo() patch)<br>As the GetSysInfo(..) function does not return a valid IP address when used in RobotStudio (it returns &#8220;VC&#8221; instead of the IP of your Windows machine), we need to change something in the ROS_socket.sys source file.<\/p>\n\n\n\n<p>Make sure the Windows PC has a static IP configured. If the workstation does not have a static IP address, you will have to repeat the changes below each time the IP address changes.<\/p>\n\n\n\n<p>Figure 45 Network Connection windows<br>Type \u201cipconfig\u201d in the command prompt to check the IPv4 address. Change the IP address of the Ethernet adapter used for Ubuntu (Ethernet 7 in this network) to 192.168.56.1.<\/p>\n\n\n\n<p>Figure 46 Ipconfig command prompt<br>The Ethernet adapter IP address is 192.168.56.1 because it has to connect to the VirtualBox Host-only Ethernet adapter. 
In the VM network settings on Ubuntu, select Host-only Adapter.<\/p>\n\n\n\n<p>Figure 47 Network settings of the VM<br>\u2022 Now open ROS_socket.sys (in RobotStudio or Notepad) and change the following line:<br>IF (SocketGetStatus(server_socket) = SOCKET_CREATED) SocketBind server_socket, GetSysInfo(\\LanIp), port;<br>\u2022 into:<br>IF (SocketGetStatus(server_socket) = SOCKET_CREATED) SocketBind server_socket, \"192.168.56.1\", port;<\/p>\n\n\n\n<p>Figure 48 ROS_socket.sys RAPID<br>Configuring Controller Settings<br>All files in the abb_driver\/rapid (Noetic and later) directory should be copied to the robot controller. This tutorial assumes the files are copied to a &#8220;ROS&#8221; subdirectory under the system&#8217;s HOME directory (e.g. \/\/HOME\/ROS\/*) or to the Catkin workspace that we created.<br>Open the FlexPendant in RobotStudio or go to the Controller tab to set the configuration.<\/p>\n\n\n\n<p>Figure 49 Open the FlexPendant<br>Run the FlexPendant in MANUAL mode to configure the controller.<\/p>\n\n\n\n<p>Figure 50 FlexPendant running in MANUAL mode<\/p>\n\n\n\n<p>Create Tasks<br>\u2022 Browse to Controller tab \u2192 Configuration Editor \u2192 Controller \u2192 Task, then right-click New Task<br>\u2022 (In RobotStudio 5, this is found under ABB \u2192 Control Panel \u2192 Configuration \u2192 Topics \u2192 Controller \u2192 Task)<br>\u2022 Create 3 tasks as follows:<br>Table 1 Create Tasks<br>Name | Type | Trust Level | Entry | Motion Task<br>ROS_StateServer | SEMISTATIC | NoSafety | main | NO<br>ROS_MotionServer | SEMISTATIC | SysStop | main | NO<br>T_ROB1 | NORMAL | N\/A | main | YES<\/p>\n\n\n\n<p>Figure 51 Create Tasks on FlexPendant<\/p>\n\n\n\n<p>Figure 52 Create Tasks on RobotStudio<br>Create Signals<br>\u2022 Browse to Controller tab \u2192 Configuration Editor \u2192 I\/O System \u2192 Signal, then right-click New Signal<br>\u2022 (In RobotStudio 5, this is found under ABB \u2192 Control Panel \u2192 Configuration \u2192 Topics \u2192 I\/O \u2192 Signal)<br>\u2022 Create 7 signals as follows (the seventh, signalRosMotionTaskExecuting, is the one tied to Task Executing in Table 3):<br>Table 2 Create Signals<br>Name | Type of Signal<br>signalExecutionError | Digital Output<br>signalMotionPossible | Digital Output<br>signalMotorOn | Digital Output<br>signalRobotActive | Digital Output<br>signalRobotEStop | Digital Output<br>signalRobotNotMoving | Digital Output<br>signalRosMotionTaskExecuting | Digital Output<\/p>\n\n\n\n<p>Figure 53 Create Signals on FlexPendant<\/p>\n\n\n\n<p>Figure 54 Create Signals on RobotStudio<br>Tie Signals to the System Outputs<br>\u2022 Browse to Controller tab \u2192 Configuration Editor \u2192 I\/O System \u2192 System Output, then right-click New System Output<br>\u2022 (In RobotStudio 5, this is found under ABB \u2192 Control Panel \u2192 Configuration \u2192 Topics \u2192 I\/O \u2192 System Output)<br>\u2022 Add one entry per signal as follows:<br>Table 3 Tie Signals to the System Outputs<br>Signal Name | Status | Arg 1 | Arg 2 | Arg 3 | Arg 4<br>signalExecutionError | Execution Error | N\/A | T_ROB1 | N\/A | N\/A<br>signalMotionPossible | Runchain OK | N\/A | N\/A | N\/A | N\/A<br>signalMotorOn | Motors On State | N\/A | N\/A | N\/A | N\/A<br>signalRobotActive | Mechanical Unit Active | ROB_1 | N\/A | N\/A | N\/A<br>signalRobotEStop | Emergency Stop | N\/A | N\/A | N\/A | N\/A<br>signalRobotNotMoving | Mechanical Unit Not Moving | ROB_1 | N\/A | N\/A | N\/A<br>signalRosMotionTaskExecuting | Task Executing | N\/A | T_ROB1 | N\/A | N\/A<\/p>\n\n\n\n<p>Figure 55 Signals to the System Outputs on FlexPendant<\/p>\n\n\n\n<p>Figure 56 Signals to the System Outputs on RobotStudio<br>Load Modules to Tasks<br>\u2022 Browse to Controller tab \u2192 Configuration Editor \u2192 Controller \u2192 Automatic Loading of 
Modules, then right-click New Automatic Loading of Modules<br>\u2022 (In RobotStudio 5, this is found under ABB \u2192 Control Panel \u2192 Configuration \u2192 Topics \u2192 Controller \u2192 Automatic Loading of Modules)<br>\u2022 Add one entry for each server file as follows:<br>Table 4 Load Modules to Tasks<br>File | Task | Installed | All Tasks | Hidden<br>HOME:\/ROS\/ROS_common.sys | N\/A | NO | YES | NO<br>HOME:\/ROS\/ROS_socket.sys | N\/A | NO | YES | NO<br>HOME:\/ROS\/ROS_messages.sys | N\/A | NO | YES | NO<br>HOME:\/ROS\/ROS_stateServer.mod | ROS_StateServer | NO | NO | NO<br>HOME:\/ROS\/ROS_motionServer.mod | ROS_MotionServer | NO | NO | NO<br>HOME:\/ROS\/ROS_motion.mod | T_ROB1 | NO | NO | NO<\/p>\n\n\n\n<p>Figure 57 Load Modules to Tasks on FlexPendant<\/p>\n\n\n\n<p>Figure 58 Load Modules to Tasks on RobotStudio<br>After the last change, select YES to restart the controller and apply the changes. Then go to the program editor, select Tasks and Programs, and set the modules as shown.<\/p>\n\n\n\n<p>Figure 59 Editor: select Tasks and Programs and set the modules<\/p>\n\n\n\n<p>Run in AUTO mode and set PP to Main to run the programs.<\/p>\n\n\n\n<p>Figure 60 AUTO mode on the FlexPendant<br>Run the program<br>Click Play; the robot waits for a connection.<\/p>\n\n\n\n<p>Figure 61 FlexPendant waiting for connection<\/p>\n\n\n\n<p>Open a terminal on Ubuntu and type &#8220;ping 192.168.56.1&#8221; to check the connection with RobotStudio on Windows.<\/p>\n\n\n\n<p>Figure 62 ping 192.168.56.1 on Ubuntu<br>\u2022 Open a new terminal, run RViz with the ABB robot, and set the IP address to connect to the robot:<br>$ roslaunch abb_irb120_moveit_config moveit_planning_execution.launch sim:=false robot_ip:=192.168.56.1<\/p>\n\n\n\n<p>Figure 63 abb_irb120_moveit<\/p>\n\n\n\n<p>When the program opens, RobotStudio will connect with RViz and can control the robot IRB 120; select Main to PP in the Production Window tab.<\/p>\n\n\n\n<p>Figure 64 FlexPendant connection with ROS<br>We can move the robot by using the interactive marker or by setting the Goal State to a random robot position. After moving the robot, click Plan &amp; Execute; RViz will send the data points of the motion task to RobotStudio on the FlexPendant.<\/p>\n\n\n\n<p>Figure 65 RobotStudio and MoveIt<\/p>\n\n\n\n<p>Click the Play button on the FlexPendant; the robot will run through the received points from start to end.<\/p>\n\n\n\n<p>Figure 66 RobotStudio and MoveIt communication<br>3.3.4 Simulation ABB with RViz and Real robot IRB120<br>The connection between ROS and the real robot is made almost in the same way. First of all, the IRC5 controller of the robot has to be configured as was done in the simulation, following all the steps but with the real FlexPendant, and using port X2 on the controller for TCP\/IP (make sure all the controller settings from the previous section are configured on the real FlexPendant).<\/p>\n\n\n\n<p>Figure 67 IRC5 controller<br>Set the server socket IP &#8220;192.168.125.1&#8221; on the IRC5 robot controller. The IP address in the socket code is the server address that the other side must match to connect to the IRC5 controller.<\/p>\n\n\n\n<p>Figure 68 Server socket IP &#8220;192.168.125.1&#8221;<br>Set the Ethernet IP address of the Windows computer to 192.168.125.2 to connect to the IRC5 controller. Go to Control Panel \u2192 Network and Internet \u2192 Network Connections \u2192 Ethernet Status.<\/p>\n\n\n\n<p>Figure 69 Ethernet Status, 
IP 192.168.125.1<\/p>\n\n\n\n<p>Open Details and set IP 192.168.125.2 with subnet mask 255.255.255.0.<\/p>\n\n\n\n<p>Figure 70 IP 192.168.125.2 Subnet mask 255.255.255.0<br>Click &#8220;Add Controller&#8221; to connect to the IP address from RobotStudio.<\/p>\n\n\n\n<p>Figure 71 Add Controller to connect the IP<br>When connected with the IRC5 controller, the Controller tab will show the management port as connected.<\/p>\n\n\n\n<p>Figure 72 IRC5 controller port connected.<\/p>\n\n\n\n<p>Click &#8220;Request write access&#8221; and grant access on the real FlexPendant to allow uploading and editing code from RobotStudio.<\/p>\n\n\n\n<p>Figure 73 Request write access on the FlexPendant<br>Click &#8220;Create Relation&#8221; to transfer all data and RAPID code.<\/p>\n\n\n\n<p>Figure 74 Create Relation<br>In Create Relation, type the relation name, set the first controller to the RobotStudio station that we created, and set the second controller to the IRC5 controller.<\/p>\n\n\n\n<p>Figure 75 Create Relation settings<br>Now all data and code can be transferred from the source to the target by clicking &#8220;Transfer now&#8221;. (The transfer direction between source and target can also be changed.)<\/p>\n\n\n\n<p>Figure 76 Transfer all data<br>When &#8220;Transfer now&#8221; is clicked, the program will show the Transfer Summary; click Yes to continue the upload.<\/p>\n\n\n\n<p>Figure 77 Transfer Summary<br>Finally, we restart the real controller from RobotStudio. To do this, click on the real controller and, in the Controller section, select Restart &#8211; Reset RAPID (P-start).<\/p>\n\n\n\n<p>Figure 78 Reset RAPID (P-start).<br>If everything has worked correctly, you will see on the FlexPendant a notification that the connection is waiting.<\/p>\n\n\n\n<p>Figure 79 FlexPendant notification that the connection is waiting.<\/p>\n\n\n\n<p>Now we can establish the connection between the IRC5 robot controller and the ROS Noetic laptop via the Ethernet cable. Go to the network settings on Ubuntu and change the Ethernet connection.<\/p>\n\n\n\n<p>Figure 80 Network settings on Ubuntu<br>Set the IPv4 Method to manual, set the address to &#8220;192.168.125.5&#8221; and the netmask to &#8220;255.255.255.0&#8221;, then apply.<\/p>\n\n\n\n<p>Figure 81 Set the IPv4 Method<br>Next, reset the Ethernet connection on Ubuntu by turning it off and on.<\/p>\n\n\n\n<p>Figure 82 Ubuntu connection turned off\/on.<\/p>\n\n\n\n<p>Go to the Ethernet connection settings and check the IPv4 address.<\/p>\n\n\n\n<p>Figure 83 Ethernet connection on Ubuntu<br>Now we can simulate the robot IRB120 with RViz and MoveIt! 
via ROS Noetic.<br>\u2022 Open the terminal on Ubuntu and type this command to ping the Ethernet connection:<br>ping 192.168.125.1<\/p>\n\n\n\n<p>\u2022 Open a new terminal and type this command to run RViz with the ABB robot, setting the IP address to connect to the robot:<br>roslaunch abb_irb120_moveit_config moveit_planning_execution.launch sim:=false robot_ip:=192.168.125.1<\/p>\n\n\n\n<p>When the program opens, the FlexPendant will connect with RViz and the real robot IRB120 can be controlled.<\/p>\n\n\n\n<p>Figure 84 FlexPendant connection<br>Set the IRC5 controller to AUTO mode to make the robot move automatically.<\/p>\n\n\n\n<p>Figure 85 The IRC5 controller in AUTO mode<br>Move the robot using the interactive marker, or set the Goal State to a random robot position, then Plan &amp; Execute; RViz will send the data points of the motion task to the FlexPendant.<\/p>\n\n\n\n<p>Figure 86 Data points to the FlexPendant<br>Then select Main to PP in the Production Window tab and click the Play button on the FlexPendant to start the movement of the robot, and continue to Plan &amp; Execute axis positions.<\/p>\n\n\n\n<p>Figure 87 Robot Plan &amp; Execute the position axis.<\/p>\n\n\n\n<p>3.4 RealSense D435i depth camera<br>The RealSense D435i depth camera is a versatile device that provides both depth and RGB imaging capabilities, along with integrated IMU sensors, making it suitable for a wide range of applications that require accurate spatial perception.<br>To connect the RealSense D435i depth camera in Ubuntu, you&#8217;ll need to follow these steps:<br>I. Install the RealSense SDK:<br>\u2022 Open a terminal in Ubuntu.<br>\u2022 Update the package list by running the command:<br>$ sudo apt-get update<br>\u2022 Install the RealSense SDK by running the following commands:<br>$ sudo apt-get install librealsense2-dkms<br>$ sudo apt-get install librealsense2-utils<\/p>\n\n\n\n<p>II. Connect the camera:<\/p>\n\n\n\n<p>Figure 88 USB connection of the camera<br>\u2022 Plug the RealSense D435i camera into a USB 3.0 port on the Ubuntu machine.<br>\u2022 Ensure that the camera is powered on.<\/p>\n\n\n\n<p>III. Verify camera detection:<br>\u2022 Open a terminal and run the command:<br>$ lsusb<br>\u2022 Look for an entry in the output that corresponds to the RealSense camera. It may be listed as &#8220;Intel Corp.&#8221; or something similar. This confirms that the camera is detected by Ubuntu.<\/p>\n\n\n\n<p>IV. Check camera functionality:<br>\u2022 In the terminal, run the command:<br>$ realsense-viewer<br>\u2022 This will launch the RealSense Viewer application, which provides a graphical interface to interact with the camera.<br>\u2022 In the RealSense Viewer, you should see the camera feed, depth data, and various camera settings.<br>\u2022 If the camera is functioning properly, you should be able to see the camera feed in the RealSense Viewer.<\/p>\n\n\n\n<p>V. Install the RealSense ROS package:<br>\u2022 To install the RealSense ROS package for the project&#8217;s Catkin workspace, first change to the workspace source folder:<br>$ cd ~\/catkin_ws\/src<br>\u2022 Then open a terminal in Ubuntu and run the following commands:<br>$ sudo apt-get update<br>$ sudo apt-get install ros-noetic-realsense2-camera<br>$ sudo apt-get install ros-noetic-realsense2-description<br>\u2022 These commands will install the RealSense SDK and the ROS packages necessary to use the RealSense camera in ROS.<\/p>\n\n\n\n<p>VI. Publish the RealSense ROS node:<br>\u2022 The published topics differ according to the device and parameters. Run the following command to start the camera node in ROS:<br>$ roslaunch realsense2_camera rs_camera.launch<br>\u2022 This will launch the camera node, which will start publishing the camera feed and depth data to ROS topics.<br>\u2022 With a D435i attached, the following list of topics will be available:<\/p>\n\n\n\n<p>Figure 89 D435i list of topics<br>\u2022 These published topics are used in the Aruco marker program to get parameters from the RealSense camera. A minimal C++ subscriber for sanity-checking the color stream is sketched below.<\/p>
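\n\n\n\n<p>To verify from code that the camera topics are actually streaming, a minimal roscpp subscriber can be written against one of them. This sketch is not part of the project code; it only assumes the \/camera\/color\/image_raw topic published by rs_camera.launch.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>#include &lt;ros\/ros.h&gt;\n#include &lt;sensor_msgs\/Image.h&gt;\n\n\/\/ Print the resolution and encoding of each incoming color frame\nvoid imageCallback(const sensor_msgs::Image::ConstPtr&amp; msg)\n{\n    ROS_INFO(\"Color frame: %ux%u, encoding=%s\",\n             msg-&gt;width, msg-&gt;height, msg-&gt;encoding.c_str());\n}\n\nint main(int argc, char** argv)\n{\n    ros::init(argc, argv, \"realsense_topic_check\");\n    ros::NodeHandle nh;\n    \/\/ \/camera\/color\/image_raw is one of the topics published by rs_camera.launch\n    ros::Subscriber sub = nh.subscribe(\"\/camera\/color\/image_raw\", 1, imageCallback);\n    ros::spin();\n    return 0;\n}<\/code><\/pre>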
\n\n\n\n<p>3.5 Aruco marker detection<br>To utilize Aruco markers in Ubuntu with ROS Noetic, please adhere to the following steps:<br>I. Install the required packages: Open a terminal and install the necessary ROS packages for using Aruco markers by running the following command:<br>$ sudo apt-get install ros-noetic-aruco-ros<\/p>\n\n\n\n<p>II. Clone the aruco_ros package: Within the Catkin workspace directory, proceed to the &#8216;src&#8217; folder and clone the aruco_ros package from GitHub by executing the following steps:<br>$ cd ~\/catkin_ws\/src<br>$ git clone https:\/\/github.com\/pal-robotics\/aruco_ros.git<br>$ catkin build<\/p>\n\n\n\n<p>III. Modify the Aruco code for use in the project.<br>\u2022 In order to modify the Aruco code and make adjustments to the camera node and Aruco details within ROS Noetic, it is necessary to modify the configuration files of the Aruco package. Please follow the steps outlined below.<br>\u2022 Locate the Aruco package: Within the Catkin workspace, navigate to the &#8216;src&#8217; directory where you previously cloned the aruco_ros package:<br>$ cd ~\/catkin_ws\/src\/aruco_ros\/aruco_ros\/launch<\/p>\n\n\n\n<p>\u2022 Modify the camera settings: Open the launch file that corresponds to the camera you want to use. For example, to modify the settings for a single camera, open the single.launch file:<br>$ gedit single.launch<br>\u2022 Within this file, you can modify parameters that are relevant to marker detection, including but not limited to the marker size, dictionary type, marker detection threshold, and other related settings. Please review and adjust these parameters according to your specific requirements; a plain-OpenCV sketch of the detection step these parameters control follows below.<\/p>
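\n\n\n\n<p>For reference, the kind of marker detection these parameters control can be illustrated with OpenCV&#8217;s aruco module. This sketch is illustrative only and is not the code path used by aruco_ros; the image filename is a placeholder, and DICT_ARUCO_ORIGINAL is assumed here because the project&#8217;s marker id 701 falls within that dictionary&#8217;s range.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>#include &lt;opencv2\/aruco.hpp&gt;\n#include &lt;opencv2\/imgcodecs.hpp&gt;\n#include &lt;iostream&gt;\n#include &lt;vector&gt;\n\nint main()\n{\n    \/\/ Load a test image containing the marker (the filename is a placeholder)\n    cv::Mat image = cv::imread(\"marker_test.png\");\n    if (image.empty())\n        return 1;\n\n    \/\/ DICT_ARUCO_ORIGINAL is the classic ArUco family; its ids go up to 1023, so id 701 fits\n    cv::Ptr&lt;cv::aruco::Dictionary&gt; dictionary =\n        cv::aruco::getPredefinedDictionary(cv::aruco::DICT_ARUCO_ORIGINAL);\n    cv::Ptr&lt;cv::aruco::DetectorParameters&gt; params =\n        cv::aruco::DetectorParameters::create();\n\n    std::vector&lt;int&gt; ids;\n    std::vector&lt;std::vector&lt;cv::Point2f&gt; &gt; corners;\n    cv::aruco::detectMarkers(image, dictionary, corners, ids, params);\n\n    for (int id : ids)\n        std::cout &lt;&lt; \"Detected marker id: \" &lt;&lt; id &lt;&lt; std::endl;\n    return 0;\n}<\/code><\/pre>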
\n\n\n\n<p>Alternatively, adjust the parameters used in this project as follows:<br>o Set &#8220;markerSize&#8221; = 0.096<br>To define the real-world dimension of the marker, in meters (m)<br>o Set &#8220;markerId&#8221; = 701<br>To allow the program to detect this marker only<br>o Set &#8220;camera_frame&#8221; = camera_frame<br>To define the TF frame of the camera on the robot arm<br>o Set &#8220;marker_frame&#8221; = aruco_marker_frame<br>To define the name of the Aruco marker TF frame<br>o Set &#8220;ref_frame&#8221; = base_link<br>To define the reference TF frame for the Aruco marker pose<br>o Set &#8220;\/camera_info&#8221; = \/camera\/color\/camera_info<br>To remap to the RealSense RGB camera info topic<br>o Set &#8220;\/image&#8221; = \/camera\/color\/image_raw<br>To remap to the RealSense RGB image topic<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>&lt;arg name=\"markerId\"        default=\"701\"\/&gt;\n&lt;arg name=\"markerSize\"      default=\"0.096\"\/&gt;    &lt;!-- in m --&gt;\n&lt;arg name=\"eye\"             default=\"left\"\/&gt;\n&lt;arg name=\"camera_frame\"    default=\"camera_frame\"\/&gt;\n&lt;arg name=\"marker_frame\"    default=\"aruco_marker_frame\"\/&gt;\n&lt;arg name=\"ref_frame\"       default=\"base_link\"\/&gt;  &lt;!-- leave empty and the pose will be published wrt param parent_name --&gt;\n&lt;arg name=\"corner_refinement\" default=\"LINES\" \/&gt; &lt;!-- NONE, HARRIS, LINES, SUBPIX --&gt;\n\n&lt;!-- start ArUco --&gt;\n&lt;node pkg=\"aruco_ros\" type=\"single\" name=\"aruco_single\"&gt;\n    &lt;remap from=\"\/camera_info\" to=\"\/camera\/color\/camera_info\" \/&gt;\n    &lt;remap from=\"\/image\" to=\"\/camera\/color\/image_raw\" \/&gt;\n    &lt;param name=\"image_is_rectified\" value=\"True\"\/&gt;\n    &lt;param name=\"marker_size\"        value=\"$(arg markerSize)\"\/&gt;\n    &lt;param name=\"marker_id\"          value=\"$(arg markerId)\"\/&gt;\n    &lt;param name=\"reference_frame\"    value=\"$(arg ref_frame)\"\/&gt;   &lt;!-- frame in which the marker pose will be referred --&gt;\n    &lt;param name=\"camera_frame\"       value=\"$(arg camera_frame)\"\/&gt;\n    &lt;param name=\"marker_frame\"       value=\"$(arg marker_frame)\" \/&gt;\n    &lt;param name=\"corner_refinement\"  value=\"$(arg corner_refinement)\" \/&gt;\n&lt;\/node&gt;<\/code><\/pre>\n\n\n\n<p>IV. Publish the Aruco ROS node:<br>\u2022 Launch the Aruco marker detection node with the updated settings by running the appropriate launch file. For example, to launch single marker detection with the RealSense camera:<br>$ roslaunch aruco_ros single.launch<br>\u2022 This will launch the Aruco marker node, which will start publishing the pose estimate position and orientation data to ROS topics.<br>\u2022 Note that the camera node must be running before the Aruco marker node is used.<br>\u2022 The list of available topics can be viewed by running the following command:<br>$ rqt<br>\u2022 Subscribe to the Aruco marker ROS topic to show the pose estimate position and orientation by running the following command:<br>$ rostopic echo \/aruco_single\/pose<\/p>\n\n\n\n<p>Figure 90 Aruco marker detection program<\/p>
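\n\n\n\n<p>Since single.launch also publishes the marker pose as a TF frame (aruco_marker_frame expressed in base_link, per the arguments above), the same pose can be obtained with a tf2 lookup instead of subscribing to the topic. A minimal sketch, not part of the project code, using the frame names from the launch file:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>#include &lt;ros\/ros.h&gt;\n#include &lt;tf2_ros\/transform_listener.h&gt;\n#include &lt;geometry_msgs\/TransformStamped.h&gt;\n\nint main(int argc, char** argv)\n{\n    ros::init(argc, argv, \"aruco_tf_lookup\");\n    ros::NodeHandle nh;\n\n    tf2_ros::Buffer tf_buffer;\n    tf2_ros::TransformListener tf_listener(tf_buffer);\n\n    ros::Rate rate(10.0);  \/\/ poll at 10 Hz\n    while (ros::ok())\n    {\n        try\n        {\n            \/\/ Marker pose in the robot base frame, as configured by ref_frame above\n            geometry_msgs::TransformStamped tf =\n                tf_buffer.lookupTransform(\"base_link\", \"aruco_marker_frame\", ros::Time(0));\n            ROS_INFO(\"Marker at x=%.3f y=%.3f z=%.3f\",\n                     tf.transform.translation.x,\n                     tf.transform.translation.y,\n                     tf.transform.translation.z);\n        }\n        catch (tf2::TransformException&amp; ex)\n        {\n            ROS_WARN_THROTTLE(5.0, \"Marker not visible yet: %s\", ex.what());\n        }\n        rate.sleep();\n    }\n    return 0;\n}<\/code><\/pre>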
\n\n\n\n<p>3.6 Visual servoing<br>This section serves the purpose of creating a visual servoing system utilizing an ABB robot (IRB120) and an Aruco marker. The objective is to track a specific marker and generate a path from the starting pose to the goal pose of the robot within the ROS Noetic framework.<\/p>\n\n\n\n<p>I. Locate the project&#8217;s Catkin workspace and create a Catkin package by running the following commands.<br>\u2022 Open a terminal and navigate to the Catkin workspace directory.<br>\u2022 Create a new Catkin package using the catkin_create_pkg command.<br>$ cd ~\/catkin_ws\/src<br>$ catkin_create_pkg visual_servoing roscpp std_msgs sensor_msgs geometry_msgs moveit_core moveit_ros_planning moveit_ros_planning_interface moveit_visual_tools<br>\u2022 Upon executing the &#8220;catkin_create_pkg&#8221; command, a newly created directory will appear, bearing the name of the package. Inside this directory, you will discover a collection of files and folders that constitute the package.<br>\u2022 It is possible to incorporate supplementary files into the package, such as source code files, launch files, or configuration files, among others.<br>\u2022 Build catkin_ws using the &#8216;catkin build&#8217; command:<br>$ catkin build<\/p>\n\n\n\n<p>II. Create a C++ program for the visual servoing system in &#8216;src&#8217; in the visual_servoing package folder by running the following commands:<br>$ cd ~\/catkin_ws\/src\/visual_servoing\/src<br>$ gedit robot_arm_controller.cpp<\/p>\n\n\n\n<p>\u2022 Write the C++ code following the listing below:<\/p>\n\n\n\n<p>#include &lt;ros\/ros.h&gt;<br>#include &lt;geometry_msgs\/PoseStamped.h&gt;<br>#include &lt;moveit\/move_group_interface\/move_group_interface.h&gt;<\/p>\n\n\n\n<p>static const std::string PLANNING_GROUP = \"manipulator\";<br>double x, y, z, qx, qy, qz, qw;<\/p>\n\n\n\n<p>void poseCallback()<br>{<br>\/\/ Check if all variables are still 0.0 (no new marker pose) and return if so<br>if (x == 0.0 &amp;&amp; y == 0.0 &amp;&amp; z == 0.0 &amp;&amp; qx == 0.0 &amp;&amp; qy == 0.0 &amp;&amp; qz == 0.0)<br>{<br>return;<br>}<br>ROS_INFO(\"Function poseCallback\");<br>ROS_INFO(\"Received Aruco_ros pose: x=%f, y=%f, z=%f, qx=%f, qy=%f, qz=%f, qw=%f\", x, y, z, qx, qy, qz, qw);<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>moveit::planning_interface::MoveGroupInterface move_group(PLANNING_GROUP);\nmove_group.setPlanningTime(3.0);\nmove_group.setNumPlanningAttempts(1);\nmove_group.setMaxVelocityScalingFactor(0.8);        \/\/ Speed 80%\nmove_group.setMaxAccelerationScalingFactor(0.8);    \/\/ Speed 80%\nmove_group.setGoalTolerance(0.001);\ngeometry_msgs::Pose target_pose;\ntarget_pose.position.x = x;\ntarget_pose.position.y = y;\ntarget_pose.position.z = z;\ntarget_pose.orientation.x = qx;\ntarget_pose.orientation.y = qy;\ntarget_pose.orientation.z = qz;\ntarget_pose.orientation.w = qw;\n\nmove_group.setPoseTarget(target_pose);\nmoveit::planning_interface::MoveGroupInterface::Plan my_plan;\nbool success = (move_group.plan(my_plan) == moveit::planning_interface::MoveItErrorCode::SUCCESS);\nif (success) {\n    move_group.execute(my_plan);\n    ROS_INFO(\"Move to target pose success!\");\n} else {\n    ROS_WARN(\"Move to target pose failed!\");\n}\n\/\/ Reset the stored pose so the loop waits for a new marker message\nx = 0.0;\ny = 0.0;\nz = 0.0;\nqx = 0.0;\nqy = 0.0;\nqz = 0.0;\nqw = 0.0;\nROS_INFO(\"end loop\");<\/code><\/pre>\n\n\n\n<p>}<\/p>\n\n\n\n<p>void callmsgs(const geometry_msgs::PoseStamped::ConstPtr&amp; msg)<br>{<br>\/\/ Extract position information from the PoseStamped message<br>x = msg-&gt;pose.position.x - 0.4; \/\/ project-specific position offset<br>y = msg-&gt;pose.position.y;<br>z = msg-&gt;pose.position.z;<br>\/\/ Extract orientation information from the PoseStamped message<br>qx = msg-&gt;pose.orientation.x;<br>qy = msg-&gt;pose.orientation.y;<br>qz = msg-&gt;pose.orientation.z;<br>qw = msg-&gt;pose.orientation.w + 0.432050; \/\/ project-specific orientation offset; see the note below<br>}<\/p>
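\n\n\n\n<p>A caveat on the orientation offset in callmsgs above: adding a constant to the w component alone produces a non-unit quaternion, which has to be renormalized before it represents a valid rotation. A more robust way to apply a fixed orientation offset is to compose quaternions with tf2, as in the following sketch; this is not part of the project code, and the offset angle is a placeholder.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>#include &lt;tf2\/LinearMath\/Quaternion.h&gt;\n#include &lt;tf2_geometry_msgs\/tf2_geometry_msgs.h&gt;\n#include &lt;geometry_msgs\/PoseStamped.h&gt;\n\n\/\/ Apply a fixed orientation offset to the marker pose by quaternion\n\/\/ multiplication instead of adding a constant to a single component.\ngeometry_msgs::Quaternion applyYawOffset(const geometry_msgs::PoseStamped&amp; msg,\n                                         double yaw_offset_rad)  \/\/ placeholder angle\n{\n    tf2::Quaternion q_marker;\n    tf2::fromMsg(msg.pose.orientation, q_marker);\n\n    tf2::Quaternion q_offset;\n    q_offset.setRPY(0.0, 0.0, yaw_offset_rad);  \/\/ offset about the z-axis\n\n    tf2::Quaternion q_goal = q_offset * q_marker;\n    q_goal.normalize();  \/\/ keep the result a valid unit quaternion\n\n    return tf2::toMsg(q_goal);\n}<\/code><\/pre>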
\n\n\n\n<p>int main(int argc, char** argv)<br>{<br>ROS_INFO(\"START\");<br>ros::init(argc, argv, \"move_to_aruco_pose\");<br>ros::NodeHandle nh;<br>ros::AsyncSpinner spinner(2); \/\/ Use an async spinner so the subscriber callback runs alongside the loop<br>spinner.start();<br>ros::Subscriber sub = nh.subscribe(\"aruco_single\/pose\", 1, callmsgs);<br>while (ros::ok())<br>{<br>poseCallback(); \/\/ re-plans whenever a new marker pose has arrived<br>}<br>ROS_INFO(\"END\");<br>ros::waitForShutdown();<br>return 0;<br>}<br>\u2022 This code receives the pose estimate from the Aruco marker topic and the joint positions from the robot sensors to define the start pose and the goal pose. It uses the MoveIt API to plan a path, generated automatically with the OMPL library, and checks for collisions before sending the path to move the real robot arm.<br>III. To include the &#8220;robot_arm_controller.cpp&#8221; file in the CMakeLists.txt file within the visual_servoing package, follow these steps:<br>\u2022 Open a terminal and navigate to the package directory. Assuming the package is named visual_servoing, you can use the following command:<br>$ cd \/path\/to\/catkin_ws\/src\/visual_servoing<br>\u2022 Open the CMakeLists.txt file using a text editor of your choice. For example:<br>$ gedit CMakeLists.txt<br>\u2022 Inside the CMakeLists.txt file, you will find a section for adding source files to your package. Look for the lines that start with &#8216;add_executable&#8217; and &#8216;target_link_libraries&#8217;.<br>add_executable(robot_arm_controller src\/robot_arm_controller.cpp)<br>target_link_libraries(robot_arm_controller ${catkin_LIBRARIES} ${Boost_LIBRARIES})<br>Adjust the path and filename according to the actual location of your robot_arm_controller.cpp file within your package.<br>\u2022 Rebuild your Catkin workspace using the &#8216;catkin build&#8217; command:<br>$ catkin build<br>This will compile the package with the newly added source file.<\/p>\n\n\n\n<p>IV. To start the robot_arm_controller node in the catkin_ws workspace, follow these steps:<br>\u2022 Build your Catkin workspace using the catkin build command:<br>$ catkin build<br>This step is important to ensure that your package and its dependencies are compiled and built correctly.<br>\u2022 Start the robot_arm_controller node using the rosrun command:<br>$ rosrun visual_servoing robot_arm_controller<\/p>\n\n\n\n<p>3.8 Collision Safety<br>The purpose of adding collision objects to the planning scene is to provide MoveIt with information about obstacles or objects that the robot must avoid during its motion planning. By adding these collision objects, MoveIt can generate collision-free paths for the robot to follow.<\/p>\n\n\n\n<p>Figure 91 Collision Safety.<br>I. Create a C++ program for the collision objects in &#8216;src&#8217; in the visual_servoing package folder by running the following commands:<br>$ cd ~\/catkin_ws\/src\/visual_servoing\/src<br>$ gedit add_collision_object.cpp<\/p>\n\n\n\n<p>II. In this code, three collision objects are added to the planning scene: &#8220;Table&#8221;, &#8220;Controller&#8221;, and &#8220;wall&#8221;. The &#8220;Table&#8221; and &#8220;Controller&#8221; objects represent obstacles that the robot must avoid, while the &#8220;wall&#8221; object represents a physical boundary that the robot cannot pass through. 
The code also defines the dimensions of the objects and the XYZ positions of their poses in RViz.<\/p>\n\n\n\n<p>#include &lt;ros\/ros.h&gt;<br>#include &lt;moveit\/move_group_interface\/move_group_interface.h&gt;<br>#include &lt;moveit\/planning_scene_interface\/planning_scene_interface.h&gt;<br>#include &lt;moveit_msgs\/CollisionObject.h&gt;<br>#include &lt;shape_msgs\/SolidPrimitive.h&gt;<br>#include &lt;geometry_msgs\/Pose.h&gt;<\/p>\n\n\n\n<p>using namespace moveit::planning_interface;<\/p>\n\n\n\n<p>int main(int argc, char** argv)<br>{<br>ros::init(argc, argv, \"move_robot\");<br>ros::NodeHandle nh;<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>\/\/ Create a MoveGroupInterface instance\nMoveGroupInterface move_group(\"manipulator\");\n\n\/\/ Set the planning time and number of planning attempts\nmove_group.setPlanningTime(5.0);\nmove_group.setNumPlanningAttempts(10);\n\n\/\/ Create a PlanningSceneInterface instance\nPlanningSceneInterface planning_scene_interface;\n\n\/\/ Create and add the first collision object\nmoveit_msgs::CollisionObject collision_object1;\ncollision_object1.header.frame_id = move_group.getPlanningFrame();\ncollision_object1.id = \"Table\";\nshape_msgs::SolidPrimitive Table;\nTable.type = shape_msgs::SolidPrimitive::BOX;\nTable.dimensions = {1.01, 0.85, 0.79};\ngeometry_msgs::Pose pose1;\npose1.position.x = 0.0;\npose1.position.y = 0.0;\npose1.position.z = -0.4;\npose1.orientation.w = 1.0;\ncollision_object1.primitives.push_back(Table);\ncollision_object1.primitive_poses.push_back(pose1);\n\nplanning_scene_interface.applyCollisionObject(collision_object1);\nROS_INFO(\"Added Table into the world\");\n\n\/\/ Create and add the second collision object\nmoveit_msgs::CollisionObject collision_object2;\ncollision_object2.header.frame_id = move_group.getPlanningFrame();\ncollision_object2.id = \"Controller\";\nshape_msgs::SolidPrimitive Controller;\nController.type = shape_msgs::SolidPrimitive::BOX;\nController.dimensions = {0.7, 0.7, 0.5};\ngeometry_msgs::Pose pose2;\npose2.position.x = 0.0;\npose2.position.y = -0.85;\npose2.position.z = 0.25;\npose2.orientation.w = 1.0;\ncollision_object2.primitives.push_back(Controller);\ncollision_object2.primitive_poses.push_back(pose2);\nplanning_scene_interface.applyCollisionObject(collision_object2);\nROS_INFO(\"Added Controller into the world\");\n\n\/\/ Create and add the third collision object\nmoveit_msgs::CollisionObject collision_object3;\ncollision_object3.header.frame_id = move_group.getPlanningFrame();\ncollision_object3.id = \"wall\";\nshape_msgs::SolidPrimitive wall;\nwall.type = shape_msgs::SolidPrimitive::BOX;\nwall.dimensions = {0.2, 2.0, 2.0};\ngeometry_msgs::Pose pose3;\npose3.position.x = -0.4;\npose3.position.y = 0.0;\npose3.position.z = 1.0;\npose3.orientation.w = 1.0;\ncollision_object3.primitives.push_back(wall);\ncollision_object3.primitive_poses.push_back(pose3);\nplanning_scene_interface.applyCollisionObject(collision_object3);\nROS_INFO(\"Added wall into the world\");\n\nros::spin();\nreturn 0;<\/code><\/pre>\n\n\n\n<p>}<\/p>\n\n\n\n<p>By adding these collision objects to the planning scene, the robot can plan its motion safely and avoid collisions with the objects in the environment. A quick check that the objects were actually registered is sketched below.<\/p>
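\n\n\n\n<p>To confirm that the objects were registered, the PlanningSceneInterface can be queried for the names of the known collision objects. A minimal sketch, not part of the project code:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>#include &lt;ros\/ros.h&gt;\n#include &lt;moveit\/planning_scene_interface\/planning_scene_interface.h&gt;\n#include &lt;string&gt;\n#include &lt;vector&gt;\n\nint main(int argc, char** argv)\n{\n    ros::init(argc, argv, \"check_collision_objects\");\n    ros::NodeHandle nh;\n\n    moveit::planning_interface::PlanningSceneInterface psi;\n\n    \/\/ Names of all collision objects currently in the planning scene;\n    \/\/ this should list \"Table\", \"Controller\" and \"wall\" once the node above has run\n    std::vector&lt;std::string&gt; names = psi.getKnownObjectNames();\n    for (const std::string&amp; name : names)\n        ROS_INFO(\"Known collision object: %s\", name.c_str());\n\n    return 0;\n}<\/code><\/pre>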
\n\n\n\n<p>III. To include the &#8220;add_collision_object.cpp&#8221; file in the CMakeLists.txt file within the visual_servoing package, follow these steps:<br>\u2022 Open a terminal and navigate to the package directory. Assuming the package is named visual_servoing, you can use the following command:<br>$ cd \/path\/to\/catkin_ws\/src\/visual_servoing<br>\u2022 Open the CMakeLists.txt file using a text editor of your choice. For example:<br>$ gedit CMakeLists.txt<br>\u2022 Inside the CMakeLists.txt file, you will find a section for adding source files to your package. Look for the lines that start with &#8216;add_executable&#8217; and &#8216;target_link_libraries&#8217;.<br>add_executable(add_collision_object src\/add_collision_object.cpp)<br>target_link_libraries(add_collision_object ${catkin_LIBRARIES} ${Boost_LIBRARIES})<br>Adjust the path and filename according to the actual location of your add_collision_object.cpp file within your package.<br>\u2022 Rebuild your Catkin workspace using the &#8216;catkin build&#8217; command:<br>$ catkin build<br>This will compile the package with the newly added source file.<\/p>\n\n\n\n<p>IV. To start the add_collision_object node in the catkin_ws workspace, follow these steps:<br>\u2022 Build your Catkin workspace using the catkin build command:<br>$ catkin build<br>This step is important to ensure that your package and its dependencies are compiled and built correctly.<br>\u2022 Start the add_collision_object node using the rosrun command:<br>$ rosrun visual_servoing add_collision_object<\/p>\n\n\n\n<p>Figure 92 Collision object warning<br>3.9 Combine program<br>This combined program serves the purpose of launching and coordinating multiple components in an RViz environment.<br>I. Create a launch file for the visual servoing system in the &#8216;launch&#8217; folder of the visual_servoing package by running the following commands:<br>$ cd ~\/catkin_ws\/src\/visual_servoing\/launch<br>$ gedit visual_servoing.launch<\/p>\n\n\n\n<p>Each section corresponds to a different component or functionality required for the robotic system. It specifies the configuration and launch sequence for several components and nodes.<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>&lt;!-- Start Moveit ABB IRB120 model --&gt;\n&lt;arg name=\"sim\" default=\"false\" \/&gt;\n&lt;arg name=\"robot_ip\" default=\"192.168.125.1\" \/&gt;\n&lt;include file=\"$(find abb_irb120_moveit_config)\/launch\/moveit_planning_execution.launch\"&gt;\n    &lt;arg name=\"sim\" value=\"$(arg sim)\" \/&gt;\n    &lt;arg name=\"robot_ip\" value=\"$(arg robot_ip)\" \/&gt;\n&lt;\/include&gt;\n\n&lt;!-- Start Add collision object  --&gt;\n&lt;include file=\"$(find visual_servoing)\/launch\/add_collision_object.launch\" \/&gt;\n\n&lt;!-- Start Realsense Camera --&gt;\n&lt;include file=\"$(find realsense2_camera)\/launch\/rs_camera.launch\" \/&gt;\n\n&lt;!-- Start Aruco tracking --&gt;\n&lt;include file=\"$(find aruco_ros)\/launch\/single.launch\" \/&gt;<\/code><\/pre>\n\n\n\n<p>The nodes in each section:<br>\u2022 MoveIt ABB IRB120 model<br>This section launches the MoveIt configuration for an ABB IRB120 robot. It includes the moveit_planning_execution.launch file from the abb_irb120_moveit_config package.<br>\u2022 Add collision object<br>This section includes the add_collision_object.launch file from the visual_servoing package.<br>\u2022 RealSense camera<br>This section launches the RealSense camera by including the rs_camera.launch file from the realsense2_camera package.<br>\u2022 Aruco tracking<br>This section launches Aruco tracking by including the single.launch file from the aruco_ros package.<br>II. 
To start the visual_servoing.launch file in the catkin_ws workspace, follow these steps:<br>\u2022 Build your Catkin workspace using the catkin build command:<br>$ catkin build<br>This step is important to ensure that your package and its dependencies are compiled and built correctly.<br>\u2022 Start the visual_servoing.launch file using the roslaunch command:<br>$ roslaunch visual_servoing visual_servoing.launch<\/p>\n\n\n\n<p>CHAPTER 4<br>Results<br>4.1 Flow chart<\/p>\n\n\n\n<p>Figure 93 Flow chart Visual servoing<br>\u2022 First, the system receives camera data parameters from the camera sensor.<br>\u2022 The Aruco marker is detected using the camera data; a geometry message is sent out and transformed into the goal pose of the robot arm.<br>\u2022 The joint positions received from the robot sensors define the start pose of the robot arm.<br>\u2022 The start pose and goal pose are set in the MoveIt API to generate a planning path with the OMPL library. The path is checked for collisions before the robot executes the motion to the goal pose.<\/p>\n\n\n\n<p>4.2 Results Visual servoing<\/p>\n\n\n\n<p>Figure 94 Visual servoing working<\/p>\n\n\n\n<p>Visual servoing is a technique used in robotics that utilizes visual feedback from cameras or sensors to control the motion of a robot.<br>I. To run the visual servoing system in Ubuntu, you&#8217;ll need to follow these steps:<br>Open the terminal and run this command to start all the nodes of the combined program. The program will load the robot model, the collision object environment, and the camera sensor functions.<br>$ roslaunch visual_servoing visual_servoing.launch<\/p>\n\n\n\n<p>Figure 95 Visual servoing working in program.<br>II. Start visual servoing<br>Run this command to start visual servoing; the camera sensor will send values to the robot controller node.<br>$ roslaunch visual_servoing robot_arm_controller.launch<\/p>\n\n\n\n<p>\u2022 The program initializes a ROS node and subscribes to the Aruco pose estimate geometry, which is used to define the robot arm goal pose.<\/p>\n\n\n\n<p>\u2022 When the start pose from the robot sensors and the goal pose from the camera sensor are received, it extracts the position and orientation values and assigns them to the planning path variables.<\/p>\n\n\n\n<p>\u2022 The program will plan and execute the motion of the manipulator robot to reach the goal pose using the MoveIt library.<\/p>\n\n\n\n<p>\u2022 If the program finds that the path is in collision, the robot will stop and send a warning message.<\/p>\n\n\n\n<p>Figure 96 Visual servoing with program<\/p>\n\n\n\n<p>CHAPTER 5<br>Conclusion<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><img data-recalc-dims=\"1\" loading=\"lazy\" decoding=\"async\" width=\"586\" height=\"273\" src=\"https:\/\/i0.wp.com\/gulfthai.com\/wp-content\/uploads\/2023\/06\/image-11.jpg?resize=586%2C273&#038;ssl=1\" alt=\"\" class=\"wp-image-8485\"\/><\/figure><\/div>\n\n\n<p>5.1 Performance Summary<br>In conclusion, this report has presented a comprehensive investigation into the implementation of visual servoing techniques applied to an ABB robot simulation, with communication facilitated through ROS over IP. The integration of RViz and a camera sensor, coupled with the detection of Aruco markers, has enabled precise control and manipulation of the robot&#8217;s motion based on visual feedback.<br>Through this research, we have demonstrated the efficacy of visual servoing as a powerful method for robot control, offering advantages such as enhanced adaptability, real-time responsiveness, and improved accuracy. 
By utilizing the visual information captured by the camera sensor and employing the Aruco marker detection algorithm, the robot was able to autonomously adjust its position and orientation to reach desired targets with minimal error. RViz, as a powerful visualization tool provided by ROS, has served as a valuable interface for monitoring and controlling the robot&#8217;s behavior. Its intuitive interface and real-time feedback capabilities have greatly contributed to the overall success of the visual servoing system.<br>In summary, the integration of visual servoing techniques with an ABB robot simulation, ROS communication via IP, RViz control, and the utilization of a camera sensor with Aruco marker detection has demonstrated the potential for precise and adaptive robot control. The findings presented in this report contribute to the advancement of visual servoing methods and pave the way for their application in diverse domains such as industrial automation, robotic manipulation, and object tracking.<\/p>\n\n\n\n<p>5.2 Problems and suggestions<\/p>\n\n\n\n<p>Problems:<br>\u2022 The camera sensor value arrives later than the robot expects, so there is a delay in the movement.<br>\u2022 The USB connection between the camera and the controller is unstable.<br>Suggestions:<br>\u2022 Installing Ubuntu natively on the computer, rather than using Oracle VirtualBox, will result in a more stable connection.<br>Future research endeavors should focus on refining the system&#8217;s performance by exploring advanced marker detection algorithms, addressing limitations related to lighting and occlusions, and investigating potential applications in real-world scenarios.<br>The ongoing progress in visual servoing technology promises significant contributions to the field of robotics, enabling more accurate and versatile control of robotic systems in various practical settings.<\/p>\n\n\n\n<p>References<br>Hoorn, G. v., 2023. abb_driver: RobotStudio tutorial. [Online]<br>Available at: http:\/\/wiki.ros.org\/abb_driver\/Tutorials\/RobotStudio<br>Hoorn, G. v., 2023. ABB Experimental. [Online]<br>Available at: http:\/\/wiki.ros.org\/abb_experimental<br>Open Source Robotics Foundation, Inc. (OSRF), n.d. ROS documentation. [Online]<br>Available at: http:\/\/wiki.ros.org\/noetic\/Installation<br>Ingvaldsen, M., 2019. The benefits of 3D hand-eye calibration. [Online]<br>Available at: https:\/\/blog.zivid.com\/importance-of-3d-hand-eye-calibration<br>Intel\u00ae RealSense&#x2122;, 2023. RealSense-ROS. [Online]<br>Available at: https:\/\/github.com\/IntelRealSense\/realsense-ros<br>Kalachev, O., 2022. Aruco Generator. [Online]<br>Available at: https:\/\/chev.me\/arucogen\/<br>PAL Robotics S.L., 2022. aruco_ros. [Online]<br>Available at: https:\/\/github.com\/pal-robotics\/aruco_ros<br>Raessa, M., 2022. MoveIt Tutorials. [Online]<br>Available at: https:\/\/ros-planning.github.io\/moveit_tutorials\/<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Mr. Atthaphan Paksakunnee 6230350483 Mr. Tathipan Chaiwattanapan 6230340038 Mr. 
Bantoon Saengpairao 6230340054 03607499 Engineering Project for Robotics and Automation System IIIThis project is part of the Bachelor of Engineering curriculum.Program in Robotic and Automation Systems Engineering (International), Faculty of Engineering, Sriracha Kasetsart UniversityThe academic year 2023June 30, 2023Engineering project certificateRobotic and Automation System Engineering (International)Project<br \/><a class=\"read-more\" href=\"https:\/\/gulfthai.com\/?p=8469\">Complete Reading<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[5],"tags":[],"class_list":["post-8469","post","type-post","status-publish","format-standard","hentry","category-engineering"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/gulfthai.com\/index.php?rest_route=\/wp\/v2\/posts\/8469","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/gulfthai.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/gulfthai.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/gulfthai.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/gulfthai.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=8469"}],"version-history":[{"count":5,"href":"https:\/\/gulfthai.com\/index.php?rest_route=\/wp\/v2\/posts\/8469\/revisions"}],"predecessor-version":[{"id":8625,"href":"https:\/\/gulfthai.com\/index.php?rest_route=\/wp\/v2\/posts\/8469\/revisions\/8625"}],"wp:attachment":[{"href":"https:\/\/gulfthai.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=8469"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/gulfthai.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=8469"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/gulfthai.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=8469"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}