1 Introduction

An articulated robotic arm is a class of programmable mechanical arm with rotary joints that functions much like a human arm [1]. Such a robotic arm can accomplish manipulation tasks either automatically or under the control of a human operator. While automatic operation is required for repetitive and continuous work in a factory, human-operated arms are required for manual tasks. As the manual arm is controlled from a distance, several communication technologies (e.g., Bluetooth, Wi-Fi) are available for this purpose. This paper presents the kinematic modeling and mechanical design of a five degrees-of-freedom (DoF), remote vision-based, manually operated robotic arm in which the data is transferred over Wi-Fi.

The motivation behind this work is the overwhelming demand for greater compatibility and a higher degree of freedom. This demand has led to the emergence of teleoperated robotic manipulators in industrial, medical, and military applications. Teleoperation of robotic arms is now practiced to assist doctors in geographically separated areas [2]. Teleoperated arms are also intended to handle hazardous chemicals in industry [3, 4] as well as in nuclear power plants [5] to reduce accidents. Likewise, recent literature introduces applications of the robotic arm in agriculture [6]. Beyond grasping and handling, the robotic manipulator has also been exercised for writing [7]. As robotic technology steps into a new era of automation and remote tasking, implementations of robotic arms are becoming increasingly complex. Robotic arms are now intended for rehabilitation [8], renewable energy [9], and motion assistance [10]. As one of the most effective tools in industry, robotic arms are used intensively in the food industry [11] and in other industrial manipulation tasks [12]. An articulated robotic arm is also used as a robotic educational platform for autonomous object manipulation, as introduced in [13].

This paper presents the design, kinematic modeling, and development of a robotic manipulator. The manipulator is a 5 DoF robotic arm with a camera placed in the eye-in-hand configuration. The camera provides visual information to the base station, which assists the operator in driving the robotic manipulator. The basic block diagram of the system is shown in Fig. 1. The instantaneous robot position is inspected visually by the operator; therefore, to avoid complicating the calculation, the dynamics are not considered in this study, as a dynamic model would normally contain a number of inertia parameters per body. A graphical user interface (GUI) at the user end consists of a canvas for live video streaming and two buttons, “play” and “exit”, used for starting and stopping the streaming, respectively.

Fig. 1
figure 1

Basic block diagram of the robotic system

For the remote operation, a joystick with an Arduino development board is employed at the user end. The Arduino reads the instantaneous X and Y positions from one analog stick and the Z position from another, where X, Y, and Z denote the user input. Two dedicated buttons are used for opening and closing the parallel jaw gripper. All the commands are then forwarded to the PC in an encoded format. The incoming bits are decoded and assessed to determine the angle of each joint through inverse kinematics. The manipulator is regulated by an Arduino that accepts the data from the client and commands each joint of the robotic arm according to the sequence of commands coming from the user end. An encoder is coupled with every joint axis to feed the current position back to the Arduino.
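The encoding step can be illustrated with a small framing sketch in Python. The actual byte layout exchanged between the Arduino and the PC is not specified in this paper, so the start byte, field widths, and button bitmask below are purely hypothetical:

```python
import struct

FRAME = ">BhhhB"   # hypothetical layout: start byte, X, Y, Z, button bits

def encode_command(x, y, z, grip_open, grip_close):
    """Pack one joystick sample into a fixed-size binary frame."""
    buttons = (1 if grip_open else 0) | (2 if grip_close else 0)
    return struct.pack(FRAME, 0xAA, x, y, z, buttons)

def decode_command(frame):
    """Recover the user input on the PC side."""
    start, x, y, z, buttons = struct.unpack(FRAME, frame)
    assert start == 0xAA, "malformed frame"
    return x, y, z, bool(buttons & 1), bool(buttons & 2)
```

A fixed-size frame of this kind keeps the decoding on the PC side trivial: each received chunk of eight bytes is one complete joystick sample.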

In the prior literature, experts have researched the robotic hand and its applications. A recent survey [14] shows that robots play a vital role in keeping the modern manufacturing industry competitive. It is safe and useful to control robots from a distance, and human operators also prefer a simple approach to controlling robots directly for handling objects. An accelerometer-based control of an industrial robotic arm using Bluetooth wireless technology is presented in [15]. Bluetooth, however, permits communication only over a short distance; moreover, the loss function of the ANN training model is not clearly reported, and operating that system requires a wearable control device which may not be comfortable for the user. In later literature, a remote control scheme for a four DoF robotic arm using a simple touchscreen user interface (UI) is shown in [16]. Other research employs a haptic device [17] for controlling the movement of the manipulator. However, a recent user study [18] shows that most users prefer a gamepad-based interface for its usability and efficiency. The communication-method selection challenge comprises finding a higher data rate with an extended range of operation. A wireless robotic arm using Bluetooth and Zigbee connectivity is reported in [19, 20]. A comparative study on wireless protocols reported in [21] exhibits that the Wi-Fi protocol has a higher data transmission rate over a comparatively longer nominal range.

The novelty of this work is to model and develop a remote vision-based robotic manipulator with a high-speed data communication rate. Lag-free video streaming is a challenge for a robotic arm controlled remotely from a distance. In the mechanical design, workability, mobility, and operability are taken into account. The robotic arm can efficiently and stably grasp an object with its parallel jaw gripper. The kinematic modeling, along with the simulation results and the mechanical design, is carried out, and the practical implementation of the system is also shown in this paper.

The remainder of this paper is organized as follows: Section II explains the kinematic modeling of the robotic arm. The MATLAB simulation results of the developed kinematic equations are provided in Section III. The mechanical design of the robotic arm is presented in Section IV. The data transfer scheme is discussed in Section V. Section VI illustrates the experimental evaluation of the proposed model, and Section VII concludes the paper.

2 Kinematic modeling

The kinematic modeling and the establishment of coordinate frames are the key to achieving harmonization between the robotic manipulator and the control device. To this end, the Denavit–Hartenberg (DH) convention is adopted, in which homogeneous matrices of the individual kinematic pairs define the coordinate transformations. The DH coordinate frames are exhibited in Fig. 2, which shows the layout of the manipulator’s joints, links, and orientations. The four links of the manipulator are labeled \(l_1, l_2, l_3,\) and \(l_4\), and their lengths are 15 cm, 44 cm, 37 cm, and 25 cm, respectively. The arm comprises five rotational joints, labeled \(J_1\), \(J_2\), \(J_3\), \(J_4\), and \(J_5\). These five joints work concurrently to grip an object in a three-dimensional workspace and thereby achieve high-performance gripping.

Fig. 2
figure 2

Kinematic chain of 5 DoF manipulator

For analyzing the inverse kinematics, the DH parameters are listed in Table 1. In the table, \(\theta\) is the rotation about the \(z_{i-1}\) axis required to align the \(x_{i-1}\) axis with the \(x_i\) axis, which includes the rotation due to the joint; d is the distance from the origin of frame \(i-1\) to the origin of frame i along the \(z_{i-1}\) axis; r is the distance from the origin of frame \(i-1\) to the origin of frame i along the \(x_i\) axis; and \(\alpha\) is the rotation about the \(x_i\) axis required to align the \(z_{i-1}\) axis with the \(z_i\) axis.

Table 1 Denavit Hartenberg parameter

According to the DH parameters, the transformation matrices of the links are represented in (1)–(5).

$$\begin{aligned} H_{1}^{0}&= \begin{bmatrix} \cos \theta _1 & 0 & \sin \theta _1 & 0 \\ \sin \theta _1 & 0 & -\cos \theta _1 & 0 \\ 0 & 1 & 0 & l_1 \\ 0 & 0 & 0 & 1 \end{bmatrix} \end{aligned}$$
$$\begin{aligned} H_{2}^{1}&= \begin{bmatrix} \cos \theta _2 & -\sin \theta _2 & 0 & l_2\cos \theta _2\\ \sin \theta _2 & \cos \theta _2 & 0 & l_2\sin \theta _2 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1 \end{bmatrix}\end{aligned}$$
$$\begin{aligned} H_{3}^{2}&= \begin{bmatrix} \cos \theta _3 & -\sin \theta _3 & 0 & l_3\cos \theta _3\\ \sin \theta _3 & \cos \theta _3 & 0 & l_3\sin \theta _3 \\ 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1 \end{bmatrix} \end{aligned}$$
$$\begin{aligned} H_{4}^{3}&= \begin{bmatrix} \cos \theta _4 & 0 & -\sin \theta _4 & 0 \\ \sin \theta _4 & 0 & \cos \theta _4 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}\end{aligned}$$
$$\begin{aligned} H_{5}^{4}&= \begin{bmatrix} \cos \theta _5 & -\sin \theta _5 & 0 & 0\\ \sin \theta _5 & \cos \theta _5 & 0 & 0 \\ 0 & 0 & 1 & l_4\\ 0 & 0 & 0 & 1 \end{bmatrix} \end{aligned}$$

By multiplying these transformation matrices, the equations of the forward kinematics are obtained from (6).

$$\begin{aligned} H_{end-effector}^{base}=H_{5}^{0} = H_{1}^{0}H_{2}^{1}H_{3}^{2}H_{4}^{3}H_{5}^{4} \end{aligned}$$

The reference transformation matrix is provided in (7).

$$\begin{aligned} H_{5}^{0} = \begin{bmatrix} i_x & j_x & k_x & p_x\\ i_y & j_y & k_y & p_y\\ i_z & j_z & k_z & p_z\\ 0 & 0 & 0 & 1 \end{bmatrix} \end{aligned}$$

where, for the proposed robotic manipulator:

$$\begin{aligned} i_x&=-s_1s_5+c_5c_1c_{234}&j_x&=-s_5c_1c_{234}-c_5s_1\\ i_y&=c_1s_5+c_5s_1c_{234}&j_y&= c_5s_1-s_5s_1c_{234}\\ i_z&= c_5s_{234}&j_z&=-s_5s_{234}\\ k_x&=-c_1s_{234}&p_x&= c_1\left( l_2c_2-l_4s_{234}+l_3c_{23}\right) \\ k_y&=-s_1s_{234}&p_y&= s_1\left( l_2c_2-l_4s_{234}+l_3c_{23}\right) \\ k_z&=c_{234}&p_z&= l_1+l_2s_2+l_3s_{23}+l_4c_{234} \end{aligned}$$
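As a numerical check, the forward kinematics of (1)–(6) can be sketched in Python. This is a minimal illustration of the DH chain, not the firmware running on the arm; the link lengths are those given above:

```python
from math import cos, sin, pi

# Link lengths in cm, as given in the kinematic model.
L1, L2, L3, L4 = 15.0, 44.0, 37.0, 25.0

def dh(theta, d, r, alpha):
    """Homogeneous transform of one Denavit-Hartenberg link."""
    ct, st, ca, sa = cos(theta), sin(theta), cos(alpha), sin(alpha)
    return [[ct, -st * ca,  st * sa, r * ct],
            [st,  ct * ca, -ct * sa, r * st],
            [0.0,      sa,       ca,      d],
            [0.0,     0.0,      0.0,    1.0]]

def matmul(a, b):
    """4x4 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(t1, t2, t3, t4, t5):
    """H_5^0 = H_1^0 H_2^1 H_3^2 H_4^3 H_5^4, as in Eq. (6)."""
    h = dh(t1, L1, 0.0, pi / 2)           # joint 1 (base rotation)
    for f in (dh(t2, 0.0, L2, 0.0),       # joint 2 (shoulder)
              dh(t3, 0.0, L3, 0.0),       # joint 3 (elbow)
              dh(t4, 0.0, 0.0, -pi / 2),  # joint 4 (wrist pitch)
              dh(t5, L4, 0.0, 0.0)):      # joint 5 (wrist roll)
        h = matmul(h, f)
    return h  # last column is the end-effector position (p_x, p_y, p_z)
```

At the home pose (all joints zero) the last column evaluates to \((l_2+l_3,\,0,\,l_1+l_4) = (81, 0, 40)\), matching the closed-form expressions for \(p_x\), \(p_y\), and \(p_z\) above.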

The human operator inputs the expected position of the manipulator through a joystick. To move the robotic arm to that position, the actuators set the arm’s joint values. The joint angles \(\theta _1, \theta _2, \theta _3, \theta _4,\) and \(\theta _5\) are calculated using (8)–(13).

$$\begin{aligned} \theta _{1}&=\tan ^{-1}\frac{p_{y}}{p_{x}} \end{aligned}$$
$$\begin{aligned} \theta _{2}&=\tan ^{-1}\frac{n \left( l_2 + l_3c_3 \right) -ml_3s_3}{nl_3s_3+ m \left( l_2 + l_3c_3 \right) } \end{aligned}$$
$$\begin{aligned} \theta _{3}&=\cos ^{-1}\frac{m^2+n^2-l_2^2-l_3^2}{2l_2l_3} \end{aligned}$$
$$\begin{aligned} \theta _{234}&=\tan ^{-1}\frac{l_2c_2+l_3c_{23}-p_xc_1-p_ys_1}{p_z-l_1-l_2s_2-l_3s_{23}} \end{aligned}$$
$$\begin{aligned} \theta _{4}&= \theta _{234}-\theta _{2}-\theta _{3} \end{aligned}$$
$$\begin{aligned} \theta _{5}&=\tan ^{-1}\frac{\left( i_yc_1-i_xs_1\right) s_{234}}{i_ys_1+i_xc_1} \end{aligned}$$

where, in (9) and (10):

$$\begin{aligned} m&= p_xc_1+p_ys_1+l_4s_{234}\\ n&= p_z-l_1-l_4c_{234} \end{aligned}$$
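Given a target position and a desired end-effector pitch \(\theta_{234}\), Eqs. (8)–(12) can be coded directly. The sketch below omits \(\theta_5\), which requires the orientation vector used in (13), and it assumes \(\theta_{234}\) is supplied by the operator-side software; the sign of the \(\cos^{-1}\) in (10) selects the elbow-up or elbow-down branch:

```python
from math import atan2, acos, cos, sin

L1, L2, L3, L4 = 15.0, 44.0, 37.0, 25.0  # link lengths (cm)

def inverse_kinematics(px, py, pz, theta234, elbow=1):
    """Joint angles for a target position, following Eqs. (8)-(12)."""
    t1 = atan2(py, px)                                      # Eq. (8)
    m = px * cos(t1) + py * sin(t1) + L4 * sin(theta234)
    n = pz - L1 - L4 * cos(theta234)
    c3 = (m * m + n * n - L2 * L2 - L3 * L3) / (2 * L2 * L3)
    c3 = max(-1.0, min(1.0, c3))     # clamp against rounding noise
    t3 = elbow * acos(c3)                                   # Eq. (10)
    t2 = atan2(n * (L2 + L3 * cos(t3)) - m * L3 * sin(t3),
               n * L3 * sin(t3) + m * (L2 + L3 * cos(t3)))  # Eq. (9)
    t4 = theta234 - t2 - t3                                 # Eq. (12)
    return t1, t2, t3, t4
```

For the home position \((l_2+l_3,\,0,\,l_1+l_4)\) with \(\theta_{234}=0\), all four angles come out zero, consistent with the forward kinematics.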

3 Simulation of the robotic arm

After the kinematic model of the robotic arm is derived, it is simulated in the MATLAB environment. A graphical user interface (GUI) is designed for simulating the five DoF robotic arm. A simulation window of the robotic arm in a 3D environment using the MATLAB Robotics Toolbox is displayed in Fig. 3.

Fig. 3
figure 3

3-D simulation of robotic arm

In the GUI window, the user inputs the position coordinates of the end effector. From these, the individual joint angles are calculated and displayed on the graph. Figure 4 shows the variation of the joint angles against time while moving from pose P1 \((100,0,-\,5)\) to pose P2 \((90,5,-\,20)\). The system uses Cartesian trajectory planning for moving from one pose to the other.
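A minimal stand-in for the toolbox’s Cartesian trajectory generator is straight-line interpolation between the two poses; each intermediate point would then be passed through the inverse kinematics to obtain joint commands. This is an illustrative sketch, not the toolbox code:

```python
def cartesian_trajectory(p1, p2, steps):
    """Straight-line Cartesian waypoints from pose p1 to pose p2."""
    return [tuple(a + (b - a) * i / (steps - 1) for a, b in zip(p1, p2))
            for i in range(steps)]

# Waypoints for the move from P1(100, 0, -5) to P2(90, 5, -20).
path = cartesian_trajectory((100, 0, -5), (90, 5, -20), steps=5)
```

Denser sampling (more steps) would smooth the joint-angle curves of Fig. 4 at the cost of more inverse-kinematics evaluations per move.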

Fig. 4
figure 4

Simulation results for joint angle versus time

4 Mechanical modeling

This section illustrates the mechanical modeling of the five DoF articulated robotic arm. The mechanical scheme of the robotic arm is driven by the considered application, i.e., grasping an object and placing it at a distinct location. The design criteria comprise the ability to move, raise, lower, and grasp an object while following the operator’s commands. The system should also be able to relocate an object without losing grip. With all this in mind, the design is first drafted in the SolidWorks environment before being conveyed to the fabrication lab.

Fig. 5
figure 5

Mechanical design of the robotic system

Figure 5 shows the computer-aided design of this robotic arm. The whole robotic arm is mounted on a robot base that can rotate about the vertical axis. A high-torque DC geared motor is installed for rotating this large mass. For rotation about the second and third joint axes, two linear actuators are used. Another two DC motors create the rotation about the fourth and fifth joint axes; for the latter, a GT2 timing belt of 5 mm width is used for motion transmission. A rotary encoder is coupled with each axis shaft to feed back the joint value.

For the end effector of the robotic hand, a parallel jaw gripper is preferred. The significance of the parallel gripper is that its jaws remain completely parallel throughout the opening and closing movements. The two parallel jaws of the gripper are placed on a nut-and-screw slider mechanism. The two halves of the screw are threaded in opposite directions, and each gripper jaw is coupled with a nut threaded onto the screw. A DC motor drives the screw through a timing belt-pulley mechanism; the motor rotates in one direction to open the gripper and in the other direction to close it. A close view of the robotic gripper with the additional setup is shown in Fig. 6.

Fig. 6
figure 6

Mechanical design for robot gripper

Fig. 7
figure 7

Developed system configuration

Figure 7 presents the constructed robotic system. The main body parts of the manipulator are made of hollow aluminum bars, which balances material cost against weight to arrive at a design that is both inexpensive and light. The jaws of the gripper are 3-D printed. Furthermore, the modular mechanical design allows any component of the robot to be replaced when damaged. The drawback of the manufactured system is the backlash introduced by the transmission mechanisms.

5 Data transfer scheme

For the communication system of the robot, wireless fidelity (Wi-Fi) technology is utilized. The Wi-Fi transmission system (IEEE 802.11 standard) defines an over-the-air interface between client and server. The usual streaming model sends packets as quickly as possible; among the transport options, the TCP protocol provides reliable, ordered delivery at a sufficient data rate. Figure 8 presents the working flowchart of the TCP connection. A server and a client are created to carry out the three-way handshake of the system. Socket server programming is a reliable approach for interfacing between socket and server [22]. As there are one server and one client in this arrangement, a dedicated channel exists between the two devices, and the entire capacity of that link is kept for transmission between them. Consequently, point-to-point (P2P) communication is set up for implementing the service.

The server resides at the base station and the client in the robotic hand. The server, connected to an Arduino, not only receives the continuous data string from the user end but also waits for acknowledgments from the robotic manipulator. The user end manipulates the robotic hand with a joystick controller built around an Arduino. As the system requires only one spectator, unicast communication yields the optimal output. A wide-angle HD camera (Logitech HD Webcam) is used for capturing the live view at 1080p. The large volume of camera-feed data requires compression for smooth video streaming.
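The server–client exchange described above can be sketched with Python’s socket API. This is a localhost illustration only; the port number and the single-line command/acknowledgment format are assumptions, not the actual protocol of the system:

```python
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 5005   # illustrative endpoint; the base station
                                 # would bind the address of its Wi-Fi link

def base_station(commands, replies):
    """Base-station server: send each command, then wait for the ack."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()       # TCP three-way handshake completes here
        with conn:
            for cmd in commands:
                conn.sendall(cmd.encode() + b"\n")
                replies.append(conn.recv(64).decode().strip())

def robot_arm():
    """Robot-side client: acknowledge every command it receives."""
    with socket.create_connection((HOST, PORT)) as sock:
        for line in sock.makefile("r"):
            sock.sendall(b"ACK " + line.strip().encode() + b"\n")

# Demo on localhost: run the server in a thread, then connect the client.
replies = []
server = threading.Thread(target=base_station,
                          args=(["X90 Y5 Z-20"], replies))
server.start()
time.sleep(0.5)                  # give the server time to start listening
robot_arm()
server.join()
```

Because TCP guarantees in-order delivery, each acknowledgment can be matched to the command that produced it without any sequence numbering.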

Fig. 8
figure 8

Block diagram for establishing TCP connection

For compression, the pixel matrix taken from the camera feed is first level-shifted by subtracting 128. Then, for each \(8\times 8\) block, the two-dimensional discrete cosine transform (DCT) is calculated using (14), where u is the horizontal spatial frequency and v the vertical spatial frequency, for integers \(0 \le u, v < 8\). \(g_{x,y}\) denotes the pixel value at coordinates (x, y) and \(G_{u,v}\) the DCT coefficient at coordinates (u, v). To make the transformation orthogonal, the normalizing factor \(\beta (w)\) is introduced.

$$G_{u,v} =\beta \left( u\right) \beta \left( v\right) \sum _{x=0}^{7}\sum _{y=0}^{7}g_{x,y}\cos \left[ \pi \left( \frac{x}{8}+\frac{1}{16}\right) u\right] \cos \left[ \pi \left( \frac{y}{8}+\frac{1}{16}\right) v\right]$$
$$\begin{aligned} \beta (w) =\begin{cases} \sqrt{\frac{2}{16}}, &{} \hbox {if } w = 0 \\ \sqrt{\frac{1}{4}}, &{} \hbox {otherwise} \end{cases} \end{aligned}$$

The quantized DCT coefficients are calculated from (15). The coefficients are then encoded in a zigzag manner, and the encoding proceeds until only trailing zeros remain.

$$\begin{aligned} B_{j,k} = round \left( \frac{A_{j,k}}{Q_{j,k}}\right) \end{aligned}$$

where \(Q_{j,k}\) is the quantization factor and \(A_{j,k}\) denotes the DCT coefficient at position (j, k) of the block. For the video transmission, a TCP connection is likewise established: a server is created on a port number and waits for the client, and the client then requests a connection to that port to complete the three-way handshake. After encoding, the compressed matrix is sent to the server. On the other side, the receiver decodes the incoming message and displays it to the operator.
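The compression pipeline of (14)–(15) can be sketched for one 8×8 block. This is a plain-Python illustration of the mathematics, not the streaming implementation; a real encoder would use an optimized JPEG library:

```python
from math import cos, pi, sqrt

def beta(w):
    """Normalizing factor of the 8-point DCT."""
    return sqrt(2 / 16) if w == 0 else sqrt(1 / 4)

def dct2(block):
    """2-D DCT of a level-shifted 8x8 block, Eq. (14)."""
    return [[beta(u) * beta(v) * sum(
                block[x][y] * cos((2 * x + 1) * u * pi / 16)
                            * cos((2 * y + 1) * v * pi / 16)
                for x in range(8) for y in range(8))
             for v in range(8)] for u in range(8)]

def quantize(coeffs, q):
    """Eq. (15): element-wise rounded division by the quantization matrix."""
    return [[round(coeffs[j][k] / q[j][k]) for k in range(8)]
            for j in range(8)]

def zigzag(block):
    """Read an 8x8 block in zigzag order (low frequencies first)."""
    order = sorted(((j, k) for j in range(8) for k in range(8)),
                   key=lambda p: (p[0] + p[1],
                                  p[0] if (p[0] + p[1]) % 2 else p[1]))
    return [block[j][k] for j, k in order]
```

For a flat block (every level-shifted pixel equal to 2), only the DC coefficient \(G_{0,0}=16\) survives; the zigzag scan then yields one nonzero value followed by 63 zeros, which run-length encoding compresses very effectively.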

6 Experimental evaluation

To evaluate the performance of the proposed system, a robotic arm is developed as shown in Fig. 7. In this section, the implementation and demonstration of the developed system are exhibited through a manipulation task that involves grasping an object using the visual camera feed. The machine body of the system includes a total of six actuators, a camera, and the driving circuitry. At the base station side, a Havit HV-G83 joystick is employed for obtaining the desired robot position from the user, and a D-Link DIR-615 Wi-Fi router is used at both sides for data communication. The specifications of the robotic arm are tabulated in Table 2. To ensure good contact and gripping, the inner surface of the gripper’s jaws is covered with rubber. Moreover, a rotary encoder is coupled with each joint axis for measuring the instantaneous joint angle. All the motors are mounted in place with clamps, and the linear actuators are mounted using the dual-pivot mounting method. To heighten the perception capability of the operator, the robotic arm is equipped with feedback encoders and a camera sensor that allow the human operator to monitor the states of the robot.

Table 2 Specification of the robotic system
Fig. 9
figure 9

Sequence of workability: a Camera feeds before grasping an object, b position of arm before grasping, c Camera feeds at the time of grasping, d Robot position at the time of grasping

For the workability demonstration, a cylindrical object is prepared and placed vertically on the base plane. Figure 9 shows the execution sequence for gripping the object. At first, the human operator receives the camera feed from a Logitech HD Webcam C270 mounted at the end-point. The viewpoint of the camera before gripping is shown in Fig. 9a, and the instantaneous position of the manipulator from another viewpoint in Fig. 9b. The operator then moves the joystick to reposition the robotic arm for grasping and sends the command to close the end effector. The camera feed and the robot position at grasping time are exhibited in Fig. 9c and d. The setup of the base station is shown in Fig. 10.

Fig. 10
figure 10

Setup of the base station

The instantaneous position of each joint is obtained from the encoder mounted on the corresponding robot joint. Based on the operator’s given position (x, y, z), the angular value associated with each joint is calculated through the inverse kinematic equations. The position error is determined as the difference between the destination position and the real position; this error term is fed into the arm controller to generate the control signal. An experiment of moving the robotic arm from position \((100,0,-5)\) to position \((90,5,-20)\) is presented in Figs. 11 and 12. Figure 11a, b shows the user’s command at the beginning and at the end of the operation. The experimental tracing path of the robotic arm is illustrated in Fig. 12, from which it is observed that the robotic arm traces the path in a stepping style. This is due to the mechanical backlash present in the manufactured arm and to the resolution of the encoders. The angle \(\theta _5\) remains almost constant: since joint 5 revolves about the horizontal axis, its role is to rotate the gripped object, and in the conducted experiment the arm holds the object vertically, so this joint angle stays constant. The Euclidean distance between the initial and final positions tracked by the robotic arm during the simulation and the real study is calculated as:

$$\begin{aligned} d= \sqrt{\left( x_1-x_2\right) ^2+\left( y_1-y_2\right) ^2+\left( z_1-z_2\right) ^2} \end{aligned}$$

The distance calculated for the simulation, \(d_s\), is 18.70 units, and the distance calculated for the developed system, \(d_r\), is 19.21 units. The distance error made by the robotic arm at the considered point is 2.73%, as calculated using (18).

$$\begin{aligned} \%\; of\; error = \frac{\mid d_r-d_s \mid }{d_s}\times 100\% \end{aligned}$$
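The two formulas are easy to verify numerically; the short sketch below reproduces the reported values:

```python
from math import sqrt

def euclidean(p, q):
    """Straight-line distance between two 3-D points (formula above)."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def percent_error(d_r, d_s):
    """Eq. (18): deviation of the developed system from the simulation."""
    return abs(d_r - d_s) / d_s * 100.0

d_ideal = euclidean((100, 0, -5), (90, 5, -20))  # ideal straight-line path
error = percent_error(19.21, 18.70)
```

The ideal straight-line distance between P1 and P2 is \(\sqrt{350}\approx 18.71\), close to the simulated 18.70 units, and the error evaluates to approximately 2.73%, as reported.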

Nevertheless, the instantaneous joint angles of the five joints shown in Fig. 12 confirm a desirable performance of the robotic arm.

Fig. 11
figure 11

Command pulse for the destination position a x and y position; b z position

Fig. 12
figure 12

Instantaneous joint angle for moving from position \((100, 0, -\,5)\) to position \((90,5,-\,20)\), a \(\theta _1\); b \(\theta _2\); c \(\theta _3\); d \(\theta _4\); e \(\theta _5\)

7 Conclusion

This paper has presented the kinematic analysis, mechanical design, and practical implementation of a five DoF robotic manipulator for industrial applications. Robot arms are now employed to assist in mass-production industries; they are also implemented to mitigate the risk of injury to laborers and to take on monotonous tasks. To facilitate the remote operation of robotic arms, a teleoperated robotic system is presented in this work. In the proposed framework, the movement of the robotic manipulator is controlled by a human operator who directly perceives an eye view of the workspace. The data communication between the control station and the robotic system is carried out over Wi-Fi for faster actuation of the robotic arm. For successful manipulation, the kinematic model is first derived and simulated in the MATLAB environment. This paper also covers the SolidWorks design and the specifications of the hardware used in the implementation. The mechanical design is compact and portable, which adds to the versatility of the designed model. The experimental demonstration of the suggested approach shows that this system can be viewed as a promising solution for different industrial applications.

The integration of the Internet of Things, along with autonomous object detection and a data management scheme, is a possible future research direction for this work.