RAKS Project
‘Robotic Arm using Kinect Sensing’
Rafed Al Doohan
Australian College of Kuwait
A thesis submitted for the degree of
Bachelor of Engineering Technology
May 2013
To the Engineering Department Degree
PROJECT PAPER APPROVAL
ROBOTIC ARM USING KINECT SENSING
By
Rafed Al Doohan
A Thesis Paper Submitted in Partial
Fulfillment of the Requirements
for the Degree of
B. Engineering Technology
in the field of Electronics and Control
Approved by:
Supervisor: Dr Ayad Salhieh - Head of School of Engineering
Director: Dr Hidab Hamwi - Electrical Engineering Lecturer
Engineering Department
Australian College of Kuwait
10-OCT-2012
Acknowledgements
I would like to thank Dr Hidab Hamwi and Dr Ayed Salhieh for their encouragement and guidance as my supervisors; Dr. Ehab, Mr. Yan, Mr. Robbin and Mr. John Florisco for all their advice and assistance; and my colleague Abdul Aziz Al-Kous for initiating this idea. I would also like to thank Microsoft Kuwait for aiding me with resources for this project.
Abstract
The industrial revolution is still expanding and hazard rates are increasing, so we need a robot that can do these tasks and jobs accurately. The RAKS is one solution, and a good one, because it can work in and reach places that would otherwise take a lot of time and hours to get the job done well. It is designed to withstand high and low temperatures and radiation. The RAKS system has four degrees of freedom (DOF), each able to move freely through a 180° angle. The gripper is designed to carry at most five kilograms.
The main aim of this paper is to show the service that RAKS could provide. In the design of RAKS, the mechanical balance of the structure does not represent the whole picture. In particular, a four-wheel RC car carries the robotic arm, which is controlled by two Arduino circuits. The arm is centered in the middle and bolted with three screws; with the appropriate statics and dynamics calculations we can work out how much mass the RC car can carry. Furthermore, our main point is using Microsoft Kinect sensing to move this arm, so we need to connect the Kinect device to the Arduino and start programming from there.
Most robots these days stay in one room or position and are controlled by switches, PLCs, CNC units and so on, whereas RAKS is multi-purpose: carrying, pushing, and also fixing or doing maintenance. Typical applications include palletizing, warehouse loading/unloading, mill and machine-tool tending, parts sorting and packaging. There are also advanced robots used as prototypes which are even more complex than the ordinary factory arm.
I built the RAKS in five steps: first the design phase, second the mechanical building, third the embedded programming, fourth the Kinect programming, and fifth the connecting phase, which is the most challenging part of this system. It is simply a matter of putting the parts together and making them function without any sophisticated devices or equipment, then spending the time to assemble everything and carry on implementing the system. In a nutshell, the project suits both majors: Electronics and Mechanical students.
I have gained a lot of knowledge: how to obtain the mechanical parts, how to program in C and C#, how to control a servo such as the SAVOX servo, how to translate human motions into digital pulses or signals, how to convert between programming languages, and how to use CAD skills to produce the design.
Contents
Chapter 1: Introduction
Chapter 2: RAKS Design Phase
2.1. The RAKS Base Building
2.2. Building the ARM
Chapter 3: Arduino Programming
3.1. Arduino Connection
3.2. Arduino Code
3.3. C# with Arduino
Chapter 4: Kinect Programming
4.1. C# and WPF
4.2. Depth Code
4.3. Skeleton Code
4.4. Kinect to Arduino
Chapter 5: Servo Motors
Chapter 6: Circuit Design
Chapter 7: Mechanical Calculations
References
Appendix
List of Figures
Figure 1: Robotic Arm Toy [8]
Figure 2: Water Jet Cutting machine [9]
Figure 3: RAKS BASE OVERVIEW
Figure 4: RAKS Base Front view
Figure 5: RAKS Base side view
Figure 6: Gripper Base 3D view
Figure 7: Gripper Joint to the ARM
Figure 11: Gripper F & G & H Parts
Figure 14: Gripper O & P Parts
Figure 16: Arm Part A Joints
Figure 22: Arm Part A Base_1
Figure 23: Arm Part A Base_2
Figure 24: Arm Part A Base_3
Figure 25: Arm Part A Base_4
Figure 26: Arm Part A Base_5
Figure 30: Arduino USB Cable
Figure 31: Arduino UI [15]
Figure 32: Arduino MEGA 2560 [Appendix: Arduino Mega 2560]
Figure 33: Arduino Circuit with the servos [16]
Figure 35: The Kinect for Windows SDK architecture [21]
Figure 36: Kinect Cable
Figure 37: WPF Data binding [11]
Figure 39: RAKS Arduino Function
Figure 40: Inside SAVOX Servo Motor [12]
Figure 41: Inside Servo Motor [13]
Figure 42: Variable Pulse width control servo position [14]
Figure 43: RAKS Circuit Simulator [21]
Figure 44: RAKS Circuit PCB [21]
Figure 45: Circuit Schematic [21]
Figure 47: RAKS ARM & PART C
Chapter 1
Introduction
One example is the robot-building students at Stanford University who created a sword-fighting robot arm with the help of a Kinect sensor [11]. The robot reacts to the actions of an opponent in both offensive and defensive ways. Another example is Gambit, a chess-playing robot developed by Intel Labs Seattle researchers and students from the University of Washington, which plays chess with humans [12]. A further example, close to our system here, is the three University of Minnesota students who are using a Kinect sensor to remotely control a robotic arm, though it is not as simple as it sounds [7]. These examples should be enough to say that the RAKS belongs in this group and could in the future become a manufacturing system that factories use to solve their problems. The RAKS is controllable and accurate and gets the job done well.
This study was approved by the Engineering Department of the Australian College of Kuwait and by Microsoft in Kuwait. It was reviewed several times by mentors and by Microsoft employees here in Kuwait, and both recommended that it should work.
Why do you need a safety inspector if you have RAKS? We can easily avoid danger and stop hazards in unreachable places, and do things like welding and cutting without leaving your office. The mechanical structure needs a motion control system. The problem of robot manipulator control is to find the time behavior of the forces and torques to be delivered by the joint actuators, and a feedback control system is capable of satisfying the accuracy requirements of the execution. Control of a mobile robot differs substantially from the analogous problem for robot manipulators. If a manipulation task requires interaction between the robot and the environment, the control problem should account for the data provided by the sensors: the forces exchanged at the contact with the environment and the objects' positions as detected by suitable cameras. We also need force control and visual control techniques.
One of our I/O boards is the Arduino, which engineers have recently adopted because it avoids coding problems and is easy to troubleshoot. This development environment is a good fit for the way the RAKS has to move. We use a Microsoft development environment, Visual Studio 2010; in a nutshell, we use the C# language to develop a WPF application. We deal with a port through which data goes as a serial stream of 0s and 1s that activates a specific pin on the Arduino. Classic engineering relies on a strict process from A to B; instead you find a path C as another solution to your problem. Prototyping is the Arduino way to find an easy path to your goal [4], so we do not have to build a program from scratch; we just need some modification.
The hope of creating an AI world is an old dream, from the ancient Greeks and Arabs to Leonardo da Vinci and Nikola Tesla. Modeling, planning, and control are the early stages of building a fully structured robot. The geometric relation between the joint velocities and the end-effector's linear and angular velocities is called kinematics. The motion is described through simulation, manipulation and algorithms; once these calculations are done first, the building becomes much easier. [1]
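To make the kinematics idea concrete, here is a small illustrative sketch only; the link lengths and joint angles are assumed values, not the RAKS dimensions. It computes where the end-effector of a simple two-link planar arm ends up for a given pair of joint angles.
using System;

// Illustrative forward kinematics of a two-link planar arm.
// linkA and linkB are assumed lengths; theta1 and theta2 are assumed joint angles.
class PlanarKinematicsDemo
{
    static void Main()
    {
        double linkA = 200.0;                      // first segment length (mm), assumed
        double linkB = 150.0;                      // second segment length (mm), assumed
        double theta1 = 45.0 * Math.PI / 180.0;    // shoulder angle (rad)
        double theta2 = 30.0 * Math.PI / 180.0;    // elbow angle (rad)

        // The kinematic relation: the joint angles determine the end-effector position.
        double x = linkA * Math.Cos(theta1) + linkB * Math.Cos(theta1 + theta2);
        double y = linkA * Math.Sin(theta1) + linkB * Math.Sin(theta1 + theta2);

        Console.WriteLine($"End effector at x = {x:F1} mm, y = {y:F1} mm");
    }
}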
A programmable robot is the foundation of robot building; the ultimate level is letting the robot move and think by itself. In this thesis we show how easily one robot part can be made and controlled. However, to build something like a Toshiba robot, a NASA robot, or Iron Man from the movie, you need a full team of mechanical, mechatronics, software, electronics and drafting engineers to support you with their knowledge. In this paper I used my basic knowledge of electronics and some skill in CAD design; it works, but with a lot of help from others. This paper contributes to the continuing effort of designing and creating new inventions such as the RAKS.
Programming, or finding the right code for the robot, was not the only obstacle I faced; so were the robot's dynamics and the logic of its motion, because I built the RAKS from aluminum (this refers back to my diploma model robotic arm). I also used servo motors in this project; this paper discusses the function of the servo and compares it with other motors, such as stepper motors and DC motors. We will see how to test the circuit on a breadboard, then solder it to a copper board and connect the Arduino to it.
In my design and implementation of RAKS, I divided the project into the following phases, though to be honest I did not do them in order; I tried my best to finish them on time. They are base building, circuit design, building the arm, Kinect programming, Arduino programming and the connecting phase.
Chapter 2
RAKS Design Phase
As we mentioned in the introduction, we need to build a robotic arm, but in a simple form. It took me two courses of designing and imagining how it would look. I started by looking at my project as a toy.
Figure 1: Robotic Arm Toy [8]
I tried to copy the same functionality but failed, because of the weight problem: the mechanism is difficult to implement in a workshop and getting the pieces together is costly. So I matched up the joints with a line, and noticed that if I take a cross-section, cutting the toy into two equal halves without the gripper, our arm is there with a simpler mechanism and less weight. I took this idea, implemented it in 2D CAD, and then converted the arm into 3D to see how it looks.
Some parts were made on the lathe and with simple tools in the workshop. Other parts were made by aluminum workers, but the major arm parts were cut on a water jet cutting machine.
Figure 2: Water Jet Cutting machine [9]
I divide this section into two parts: one for the base and one for the arm itself.
The base is simply an RC car that I bought from the Friday market; I removed the body and mounted a piece of compressed wood using aluminum on both sides. [Appendix: RC Car]
Figure 3: RAKS BASE OVERVIEW
In Figure 3, the rectangular shape spreads the weight of the arm from the middle to all four corners. The dimensions of the rectangular base are 295 mm x 180 mm x 5 mm. The height from the ground is around 115 mm, with three screws holding the base. The holes are already there; they were used to hold the cover of the car.
We also notice that the back and front springs are not locked and not compressed, so we are in a safe condition and the car runs at a normal speed without using a lot of energy.
Figure 4: RAKS Base Front view
In Figure 4, the tires are wide, which increases the friction and makes the car move more stably along a straight line. The height is also well balanced for the overall car body.
Figure 5: RAKS Base side view
As shown in Figure 5, this base is built for flexible weight control, and the movement of the arm, even when stretched out, will not affect the car's movement. We will see in the mechanical section how the calculations were done. If the car tends to tip, we will solve it by hanging some metal weights at the end of the car.
The arm has 23 pieces from the base to the gripper; imagine how complicated the movement is. It is great when you make something of your own. We will view all the parts here and comment on each and how it functions. I began with the gripper design and then started creating the arm.
I took this idea from a well-designed robot that uses no gears and no complicated mechanism. [6]
Figure 6: Gripper Base 3D view
Figure 6 shows the base on which all the parts are mounted together. The servo will also be there to move the arm. Next we show the joint where the gripper connects to the arm.
Figure 7: Gripper Joint to the ARM
In Figure 7, the same type of screw goes through the four holes to hold the joint and the servo. All of them are 3 mm in diameter and take a nut, so there are four screws plus one of the same diameter drilled into the arm. The other parts are needed to complete the gripper design. First, I bought a 2 m aluminum beam and then started cutting it as in the following CAD diagrams.
Figure 8: Gripper L&M Part
Figure 8 shows two identical parts that go under the base. Let us see the other parts.
Figure 9: Gripper N Part
Figure 9 shows the part mounted on the servo with three screws; it is attached to one of the parts in Figure 8, like so:
Figure 10: Gripper K Part
Figure 10 is a part that took a while until I reached the accurate size between Figure 8 and Figure 9. The next part attached to Figure 8 is H.
Figure 11: Gripper F & G & H Parts
The F and G parts are attached to each other as shown in Figure 11, but on the other side of the base. The M part from Figure 8 is also attached on the other side, but rotated 180°. Next we show the part that sits beside L and H.
Figure 12: Gripper I Part
The part in Figure 12 is very critical because it is attached to the part in Figure 13 using 3 mm bolts.
Figure 13: Gripper J Part
Finally, the last pieces of the gripper are shown in Figure 14.
Figure 14: Gripper O & P Parts
Any mistake in Figure 12 or Figure 13 makes the motion of the gripper more difficult; it will still move, but stiffly. Next is the full sketch of the gripper.
Figure 15: RAKS Gripper
Figure 15 needs modifications to parts O and P, such as bending or cutting them to a 90-degree angle so they act like two real fingers. Now we display the other parts of the arm, categorized as A, B and C, and comment on them from Figure 16 to Figure 21.
Figure 16: Arm Part A Joints
In Figure 16 we need four of the pieces marked with a red arrow. The following parts are put together first and then attached to the long one.
Figure 17: Arm Part A_1
Figure 17 is the elbow part, on which the arm stands.
Figure 18: Arm Part A_2
As mentioned, we need four of the part in Figure 18 to build the next stage.
Figure 19: Arm Part_3
We attach Figure 19 to Figure 20 using two pieces of Figure 18.
Figure 20: Arm Part_4
The rectangular gap in Figure 21 is for the SAVOX motor. You will need some filing to get it to fit, as it is very tight. Next we move on to part A, the base.
Figure 21: Arm Part A Base
The first part, in Figure 22, goes on top of the previous section using a nut and a long screw through each of the four corners. I did not include the four holes in the design because they are easy to drill with a 10 mm offset. After that we mount the SAVOX servo and go to the next part.
Figure 22: Arm Part A Base_1
Figure 23: Arm Part A Base_2
We mount this part above the servo disc as in Figure 23, then screw them together with five 3 mm diameter screws. Next comes:
Figure 24: Arm Part A Base_3
I forgot to include in Figure 23 the four holes at the ends of Figure 24, so you need to drill four more holes. Then:
Figure 25: Arm Part A Base_4
Again, you will need the pieces from Figure 18, but just two of them to join Figure 25 to Figure 24. We must be accurate in the next part, because any mistake could mean the SAVOX servo will not fit in the required space.
Figure 26: Arm Part A Base_5
I flipped the image in Figure 26 so you can see and imagine the shape better. Part A is now complete and we continue to part B.
Figure 27: Arm Part B
The part in Figure 27 is attached to the part in Figure 20, and the next SAVOX servo is mounted on it. We then go to the last part of the arm, which is C.
Figure 28: Arm Part C
This part is attached to part B's SAVOX servo. We will see these CAD sketches again in Chapter 7. So we have the overall shape:
Figure 29: RAKS CAD DESIGN
Chapter 3
Arduino Programming
In this section we discuss how to start a program on the Arduino and link the Arduino to the PC. What is special about this device is that it is easy to handle and simple to command. Our project highlights a few areas of Arduino programming: the serial port ("COM"), how to connect the Arduino with C#, and whether the right voltage is going to the servos.
Let us begin with a small introduction. We use the Arduino Mega 2560 in this project [Appendix: Arduino Mega 2560], plus a USB cable and a PC. To configure the Arduino for the first time you need to check which OS you are running. The USB cable is the same type used for HP printers, as in Figure 30.
Figure 30: Arduino USB Cable
Moreover, we have to download the Arduino UI to give it commands and upload sketches [10]. Figure 31 shows the UI; ready-made example sketches can be found under File > Examples. Make sure you have the right software version, otherwise it will not work.
Figure 31: Arduino UI [15]
Figure 32: Arduino MEGA 2560 [Appendix: Arduino Mega 2560]
In our case we use the Arduino in a circuit with five servos, illustrated as follows:
Figure 33: Arduino Circuit with the servos [16]
Chapter 6 gives more detail on how this is done. After making these connections, the Arduino is connected to our PC and, through a certain port, we control it with the commands given for each part. In Figure 31 notice the function void setup() {}, in which we assign the pins, with the variables defined above it. The void loop() {} function then repeats indefinitely. In addition, we include a delay of 100 ms so we can see what is really happening, in slow motion. We do not need an extra voltage source; 5 V is enough to move these servos. Remember that I burned my Arduino UNO because of extra voltage [Appendix: Arduino UNO].
First of all, we must know how the Arduino interacts with our PC. The Arduino Mega 2560 has certain pins that output digital pulses telling a servo which position to move to and how fast to turn; Chapter 5 explains briefly how a servo works. I started from an Arduino example that connects only two servos, whereas in my case there are five [19]. So I bought a breadboard and started connecting as shown in Figure 38.
In my case I use port COM 12 to interact with my PC and read the data in Visual Studio. But first I need to use the UI shown in Figure 31 and create a new project [Appendix: Arduino Code]. I begin by including the library I am using, which is <Servo.h>. Then, as in any program, I declare variables for the servo names, and I also marked those names on the physical servos.
In the Arduino's setup() function I set the baud rate to 9600 bps according to my board settings. Then we attach the servos to pins 3, 4, 5, 6 and 10. After that we send each servo to a specific angle; in our case we center each one at 90 degrees. [17]
Furthermore, we define unsigned chars and an integer with a value of 0. In the loop() function the Arduino waits for certain values from the computer. It is set up with five inputs and the same number of outputs over the serial link. In our case we use one serial port, though you can assign more serial ports according to your programming logic. [18]
Now the flow control begins: if you receive a value, you should reply back; this is the main concept here. If the Arduino receives a specific value from your PC, the corresponding servo moves to a new position defined as 180 minus the input angle. Inside the loop, an if condition assigns the servo position for each value coming in and going out over the serial link.
This is where the fun part begins. C# suited my own skills, though you could use VB.NET instead. I will begin with the C# code, but before that I need to show how we get our Arduino into Visual Studio 2010.
First, download the Visual Micro add-in and install it on your PC by running the installation file. Then configure your Arduino by going to Tools > Arduino > Programmers and pointing it at the file called arduino.exe; the folder you installed in section 3.1 should contain it, or just type the path and wait until Visual Studio configures it. Finally, you can open a new project and begin programming, or use an old sketch and continue coding from there.
I used several examples; one of them controls the servo using a cursor (slider) bar. I just took the concept and implemented the RAKS code.
Here is another example of how to use C# to interact with the Arduino board. The following is the Arduino side of the code:
#include <Servo.h>

Servo myservo;  // create servo object to control a servo
                // a maximum of eight servo objects can be created

void setup()
{
  Serial.begin(9600);   // open the serial link to the PC at 9600 bps
  myservo.attach(9);    // attaches the servo on pin 9 to the servo object
  myservo.write(0);     // and sets the rotation to 0
}

int pos1 = 0;
int pos2 = 0;
int pos3 = 0;
int totalMove = 0;

void loop()
{
  if (Serial.available() > 0)
  {
    pos1 = Serial.read() - '0';   // a single digit sent from C# as an ASCII character
    // pos2 = Serial.read() - '0';
    // pos3 = Serial.read() - '0';
    // totalMove = ((pos3) + (pos2 * 10) + pos1 * 100);
    myservo.write(pos1);          // move the servo to the received position
  }
}
The C# should be something like this:
public void moveServo()
{
    // Open the serial port to the Arduino if it is not open yet.
    if (!serialPort1.IsOpen)
    {
        Console.WriteLine("Serial port was closed - opening it now");
        serialPort1.Open();
    }

    serialPort1.DtrEnable = true;

    // Listen for any bytes the Arduino sends back.
    serialPort1.DataReceived +=
        new System.IO.Ports.SerialDataReceivedEventHandler(
            serialPort1_DataReceived);

    // Send a single byte (the servo position) to the Arduino.
    serialPort1.Write(new byte[] { 57 }, 0, 1);
}
This should live in a .cs code-behind file of the WPF application.
Chapter 4
Kinect Programming
The innovative technology behind Kinect is a combination of hardware and software contained within the Kinect sensor accessory that can be added to any existing Xbox 360. The Kinect sensor is a flat black box that sits on a small platform, placed on a table or shelf near the television you're using with your Xbox 360. Newer Xbox 360s have a Kinect port from which the device can draw power, but the Kinect sensor comes with a power supply at no additional charge for users of older Xbox 360 models. For a video game to use the features of the hardware, it must also use the proprietary layer of Kinect software that enables body and voice recognition from the Kinect sensor. [23]
Figure 34: KINECT [22]
Now we will see how the SDK computes and tracks movement as a sequence of streams. Figure 34 shows three streams: infrared/color video, depth and skeleton, and in addition there is a sound recognition sensor. [3]
So we have the video stream, the depth stream and the audio stream. Each delivers frames whose data map to x, y, z coordinates on your screen. The next step is how to display the Kinect data; the answer is the WPF development environment.
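As a quick sketch of this (using the Kinect for Windows SDK v1 calls that appear later in this chapter; the exact stream formats chosen here are only an assumption), the streams are enabled on the sensor before it is started:
// Find the first connected Kinect and enable its streams (sketch only).
KinectSensor sensor = null;
foreach (KinectSensor potentialSensor in KinectSensor.KinectSensors)
{
    if (potentialSensor.Status == KinectStatus.Connected)
    {
        sensor = potentialSensor;
        break;
    }
}
if (sensor != null)
{
    sensor.ColorStream.Enable(ColorImageFormat.RgbResolution640x480Fps30); // video stream
    sensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);    // depth stream
    sensor.SkeletonStream.Enable();                                        // skeleton stream
    sensor.Start();
}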
There are many ways to program this. I matched several code samples, put them together, and made my own code. Open Visual Studio 2010 and select New Project, making sure you are using C#. Then select WPF Application and add a reference to the Microsoft.Kinect.dll file. You can begin coding, but first make sure your Kinect is connected using the following cable:
The Kinect confirms that it has power by starting to blink. We will apply our code in several steps. I included the tilt-control method and, to make it more interesting, added a control panel just for it. In the next sections we will see how it works with the depth and skeleton streams, which COM port we are on, and how the data is sent.
4.1. C# and WPF
Windows Presentation Foundation (WPF) is a next-generation presentation system for building Windows client applications with visually stunning user experiences. With WPF, you can create a wide range of both standalone and browser-hosted applications. WPF exists as a subset of .NET Framework types that are for the most part located in the System.Windows namespace. If you have previously built applications with .NET Framework using managed technologies like ASP.NET and Windows Forms, the fundamental WPF programming experience should be familiar; you instantiate classes, set properties, call methods, and handle events, all using your favorite .NET Framework programming language, such as C# or Visual Basic.[11]
WPF offers additional programming enhancements for Windows client application development. One obvious enhancement is the ability to develop an application using both markup and code-behind, an experience that ASP.NET developers should be familiar with. You generally use Extensible Application Markup Language (XAML) markup to implement the appearance of an application while using managed programming languages (code-behind) to implement its behavior. [11]
XAML is an XML-based markup language that is used to implement an application's appearance declaratively. It is typically used to create windows, dialog boxes, pages, and user controls, and to fill them with controls, shapes, and graphics. [11]
The main behavior of an application is to implement the functionality that responds to user interactions, including handling events (for example, clicking a menu, tool bar, or button) and calling business logic and data access logic in response. In WPF, this behavior is generally implemented in code that is associated with markup. This type of code is known as code-behind. [11] A minimal illustration follows.
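As a minimal, hypothetical illustration (the names below are not part of the RAKS code): the window's XAML would declare a button such as <Button Name="submitButton" Click="SubmitButton_Click"/>, and the code-behind file provides the handler that gives it behavior.
using System.Windows;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();   // loads the appearance declared in MainWindow.xaml
    }

    // Referenced from the markup; the behavior lives here in code-behind.
    private void SubmitButton_Click(object sender, RoutedEventArgs e)
    {
        MessageBox.Show("Button was clicked.");
    }
}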
Most applications are created to provide users with the means to view and edit data. For WPF applications, the work of storing and accessing data is already provided for by technologies such as Microsoft SQL Server and ADO.NET. After the data is accessed and loaded into an application's managed objects, the hard work for WPF applications begins. Essentially, this involves two things: copying the data from the managed objects into controls where it can be displayed and edited, and ensuring that changes made through the controls are written back to the managed objects. [11]
To simplify application development, WPF provides a data binding engine to automatically perform these steps. The core unit of the data binding engine is the Binding class, whose job is to bind a control (the binding target) to a data object (the binding source). This relationship is illustrated by the following figure. [11]
Figure 37: WPF Data binding [11]
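For example, here is a small hypothetical sketch of the Binding class (ServoStatus, JointAngle and statusTextBlock are illustrative names, not part of the thesis code): the TextBlock is the binding target and the data object is the binding source.
using System.Windows.Controls;
using System.Windows.Data;

// Assumed binding source: a plain data object exposing the value we want to display.
public class ServoStatus
{
    public double JointAngle { get; set; }
}

public static class BindingDemo
{
    // statusTextBlock is a TextBlock declared in the window's markup (the binding target).
    public static void BindStatus(TextBlock statusTextBlock, ServoStatus servoStatus)
    {
        // The Binding object ties TextBlock.Text to ServoStatus.JointAngle; the data
        // binding engine copies the source value into the control for us.
        var binding = new Binding("JointAngle") { Source = servoStatus };
        statusTextBlock.SetBinding(TextBlock.TextProperty, binding);
    }
}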
The Media Element control is capable of playing both video and audio, and it is flexible enough to be the basis for a custom media player.
A Windows Presentation Foundation (WPF) application includes the elements that are common to most WPF applications: Extensible Application Markup Language (XAML) markup, code-behind, application definitions, controls, layout, data binding, and styles.
Not much is new here since tutorial #2. In InitializeKinect() we enable the DepthStream and listen to the DepthFrameReady event instead of the ColorStream and ColorFrameReady as we did when getting the RGB image. The code below shows how to initialize the Kinect: [24]
private bool InitializeKinect()
{
kinectSensor.DepthStream.Enable(DepthImageFormat.Resolution640x480Fps30);
kinectSensor.DepthFrameReady += new EventHandler<DepthImageFrameReadyEventArgs>(kinectSensor_DepthFrameReady);
try
{
kinectSensor.Start();
}
catch
{
connectedStatus = "Unable to start the Kinect Sensor";
return false;
}
return true;
}
Here we get the depth data from the device in millimeters and convert it into a distance we can use to display a black-and-white map of the depth. The Kinect has a range from 0.85 m to 4 m (for the Xbox model; the PC version can see closer and further). We use this to create a black-and-white image where each pixel represents the distance from the camera. We may also get some unknown-depth pixels where the rays hit a window, a shadow, a mirror and so on (these have a distance of 0). [24]
First of all, we grab the captured DepthImageFrame from the device. Then we copy this data and convert the depth frame into a 32-bit format that we can use as the source for our pixels. The ConvertDepthFrame function converts the 16-bit grayscale depth frame that the Kinect captured into a 32-bit image frame. This function was copied from the Kinect for Windows sample that came with the SDK. [24]
Let’s take a look at the code.
void kinectSensor_DepthFrameReady(object sender, DepthImageFrameReadyEventArgs e)
{
using (DepthImageFrame depthImageFrame = e.OpenDepthImageFrame())
{
if (depthImageFrame != null)
{
short[] pixelsFromFrame = new short[depthImageFrame.PixelDataLength];
depthImageFrame.CopyPixelDataTo(pixelsFromFrame);
byte[] convertedPixels = ConvertDepthFrame(pixelsFromFrame, ((KinectSensor)sender).DepthStream, 640 * 480 * 4);
Color[] color = new Color[depthImageFrame.Height * depthImageFrame.Width];
kinectRGBVideo = new Texture2D(graphics.GraphicsDevice, depthImageFrame.Width, depthImageFrame.Height);
// Use convertedPixels from the DepthImageFrame as the data source for our Texture2D
kinectRGBVideo.SetData<byte>(convertedPixels);
}
}
}
Notice that we didn’t manually create a Color-array as we did in the previous tutorial. You could have used this method instead of the Color-array method as well. Just wanted to show a few ways to do this just in case you need better control[24]
And the ConvertDepthFrame function:
// Converts a 16-bit grayscale depth frame which includes player indexes into a 32-bit frame
// that displays different players in different colors
private byte[] ConvertDepthFrame(short[] depthFrame, DepthImageStream depthStream, int depthFrame32Length)
{
int tooNearDepth = depthStream.TooNearDepth;
int tooFarDepth = depthStream.TooFarDepth;
int unknownDepth = depthStream.UnknownDepth;
byte[] depthFrame32 = new byte[depthFrame32Length];
for (int i16 = 0, i32 = 0; i16 < depthFrame.Length && i32 < depthFrame32.Length; i16++, i32 += 4)
{
int player = depthFrame[i16] & DepthImageFrame.PlayerIndexBitmask;
int realDepth = depthFrame[i16] >> DepthImageFrame.PlayerIndexBitmaskWidth;
// transform 13-bit depth information into an 8-bit intensity appropriate
// for display (we disregard information in most significant bit)
byte intensity = (byte)(~(realDepth >> 4));
if (player == 0 && realDepth == 0)
{
// white
depthFrame32[i32 + RedIndex] = 255;
depthFrame32[i32 + GreenIndex] = 255;
depthFrame32[i32 + BlueIndex] = 255;
}
else if (player == 0 && realDepth == tooFarDepth)
{
// dark purple
depthFrame32[i32 + RedIndex] = 66;
depthFrame32[i32 + GreenIndex] = 0;
depthFrame32[i32 + BlueIndex] = 66;
}
else if (player == 0 && realDepth == unknownDepth)
{
// dark brown
depthFrame32[i32 + RedIndex] = 66;
depthFrame32[i32 + GreenIndex] = 66;
depthFrame32[i32 + BlueIndex] = 33;
}
else
{
// tint the intensity by dividing by per-player values
depthFrame32[i32 + RedIndex] = (byte)(intensity >> IntensityShiftByPlayerR[player]);
depthFrame32[i32 + GreenIndex] = (byte)(intensity >> IntensityShiftByPlayerG[player]);
depthFrame32[i32 + BlueIndex] = (byte)(intensity >> IntensityShiftByPlayerB[player]);
}
}
return depthFrame32;
}
What this function does is to convert the 16-bit format to a usable 32-bit format. It takes the near and far depth, and also the unknown depth (mirrors, shiny surfaces and so on) and calculates the correct color based on the distance. This function requires a few variables. You can change the function so these are defined within if you want. [24]
// color divisors for tinting depth pixels
private static readonly int[] IntensityShiftByPlayerR = { 1, 2, 0, 2, 0, 0, 2, 0 };
private static readonly int[] IntensityShiftByPlayerG = { 1, 2, 2, 0, 2, 0, 0, 1 };
private static readonly int[] IntensityShiftByPlayerB = { 1, 0, 2, 2, 0, 2, 0, 2 };
private const int RedIndex = 2;
private const int GreenIndex = 1;
private const int BlueIndex = 0;
In the Window_Loaded event, initialize the runtime with the options you want to use. For this example, set RuntimeOptions.UseSkeletalTracking to receive skeletal data and register for the SkeletonFrameReady event.[25]
nui.Initialize(RuntimeOptions.UseSkeletalTracking);
nui.SkeletonFrameReady += new EventHandler<SkeletonFrameReadyEventArgs>(nui_SkeletonFrameReady);
void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
{
SkeletonFrame allSkeletons = e.SkeletonFrame;
//get the first tracked skeleton
SkeletonData skeleton = (from s in allSkeletons.Skeletons
where s.TrackingState == SkeletonTrackingState.Tracked
select s).FirstOrDefault();
}
A Joint position returns X, Y, Z values, as explained below.
Scaling a Joint value
Given that the X and Y positions are distance measurements, we can use the Coding4Fun Kinect Toolkit ScaleTo method to scale the value to a maximum X and Y position as shown below. [25]
Joint HandRight = skeleton.Joints[JointID.HandRight].ScaleTo(640, 480);
Joint HandRight = skeleton.Joints[JointID.HandRight].ScaleTo(640, 480, .5f, .5f);
To move the ellipses in our MainWindow to the location of a Joint, we will use the method below that sets the Canvas.Left and Canvas.Top position to the X (Left) and Y (Top) value from the Joint parameter. [25]
private void SetEllipsePosition(FrameworkElement ellipse, Joint joint)
{
var scaledJoint = joint.ScaleTo(640, 480, .5f, .5f);
Canvas.SetLeft(ellipse, scaledJoint.Position.X);
Canvas.SetTop(ellipse, scaledJoint.Position.Y);
}
SetEllipsePosition(headEllipse, skeleton.Joints[JointID.Head]);
SetEllipsePosition(leftEllipse, skeleton.Joints[JointID.HandLeft]);
SetEllipsePosition(rightEllipse, skeleton.Joints[JointID.HandRight]);
To see the difference between using and not using TransformSmoothing, toggle the true/false TransformSmooth property. There are two ways to use TransformSmoothing: you can set it to true and it will use a default set of parameters, or you can customize and experiment with each of the parameters to find the ones that work best for your application. To do that, you will need to create the TransformSmoothParameters struct yourself and define the parameters. [25]
private void SetupKinect()
{
if (Runtime.Kinects.Count == 0)
{
this.Title = "No Kinect connected";
}
else
{
//use first Kinect
nui = Runtime.Kinects[0];
//Initialize to do skeletal tracking
nui.Initialize(RuntimeOptions.UseSkeletalTracking);
//add event to receive skeleton data
nui.SkeletonFrameReady += new EventHandler<SkeletonFrameReadyEventArgs>(nui_SkeletonFrameReady);
//to experiment, toggle TransformSmooth between true & false
// parameters used to smooth the skeleton data
nui.SkeletonEngine.TransformSmooth = true;
TransformSmoothParameters parameters = new TransformSmoothParameters();
parameters.Smoothing = 0.7f;
parameters.Correction = 0.3f;
parameters.Prediction = 0.4f;
parameters.JitterRadius = 1.0f;
parameters.MaxDeviationRadius = 0.5f;
nui.SkeletonEngine.SmoothParameters = parameters;
}
}
I have made my own algorithm, and the overall process, from the Arduino to the RAKS movement, is stated in the following figure:
Figure 38: RAKS Algorithm
Well, it seems as easy as it looks, but we are still missing a part, which is how the Arduino talks to our PC. We described it before, but this time we present it as a diagram. The figure below shows the Arduino interactions.
Figure 39: RAKS Arduino Function
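A hedged sketch of the PC side of this interaction follows. It builds the ten-byte packet that the Arduino sketch in the Appendix expects (x, y, z, the joint ID, then six spare bytes) and writes it over the serial port; joint ID 7 is HandLeft and 11 is HandRight, matching the checks in the Arduino loop(). The method name is illustrative; _serialPort is the SerialPort object from the RAKS C# code in the Appendix.
// Send one scaled joint position to the Arduino (sketch only).
private void SendJointToArduino(byte x, byte y, byte z, byte jointId)
{
    if (!_serialPort.IsOpen)
        _serialPort.Open();

    // x and y are already scaled to 0-180, so the Arduino can write them to a servo directly.
    byte[] packet = { x, y, z, jointId, 0, 0, 0, 0, 0, 0 };
    _serialPort.Write(packet, 0, packet.Length);
}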
Chapter 5
Servo Motors
A servomotor is a rotary actuator that allows for precise control of angular position. It consists of a motor coupled to a sensor for position feedback, through a reduction gearbox. It also requires a relatively sophisticated controller, often a dedicated module designed specifically for use with servomotors.
Figure 40: Inside SAVOX Servo Motor [12]
The servo motor system is an automatic device that uses error-sensing negative feedback to correct the performance of a mechanism. The standard voltage is 4.8 V DC, though 6 V and 12 V are also seen on a few servos. The control signal is a digital PWM signal with a 50 Hz frame rate. Within each 20 ms timeframe, an active-high digital pulse controls the position. The pulse nominally ranges from 1.0 ms to 2.0 ms, with 1.5 ms always being the center of the range.
Figure 41: Inside Servo Motor [13]
Angular position motors measure their current position with a sensor (rotary encoder, Hall sensors, potentiometer, etc.) and then use a feedback system to adjust the voltage in the motor until the measured position is stable at the desired value. The same can be done for force if you change the sensor to a force sensor and feed the measurement into the feedback system.
Torque is a better property to talk about than force for a motor, and most motor datasheets list a torque constant Kt in units of N·m/A. Using a current-sense resistor, a current mirror, or some other current sensor that tells you how much current is going into the motor, you can calculate the torque the motor is applying (for example, with Kt = 0.5 N·m/A and 2 A of current, the torque is about 1 N·m). Combine this with your position sensor to determine how much force is applied over what distance, and feed both measurements into a feedback system that varies the motor voltage to keep the force at the commanded value.
Servos are controlled by sending an electrical pulse of variable width, or pulse width modulation (PWM), through the control wire. There is a minimum pulse, a maximum pulse, and a repetition rate. Servo motors can usually only turn 90 degrees in either direction, for a total of 180 degrees of movement. The motor's neutral position is defined as the position where the servo has the same amount of potential rotation in both the clockwise and counter-clockwise directions. The PWM signal sent via the control wire determines the position of the shaft: based on the duration of the pulse, the rotor turns to the desired position. The servo motor expects to see a pulse every 20 milliseconds (ms), and the length of the pulse determines how far the motor turns. For example, a 1.5 ms pulse makes the motor turn to the 90-degree position; shorter than 1.5 ms moves it toward 0 degrees, and longer than 1.5 ms turns the servo toward 180 degrees [14], as diagrammed below.
Figure 42: Variable Pulse width control servo position [14]
When these servos are commanded to move, they move to the position and hold it. If an external force pushes against the servo while it is holding a position, the servo will resist moving out of that position. The maximum amount of force the servo can exert is called the torque rating of the servo. Servos will not hold their position forever, though; the position pulse must be repeated to instruct the servo to stay in position. [14]
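As a small illustration of the pulse widths described above (a sketch only; the Arduino Servo library performs this mapping internally), the commanded angle can be converted to a pulse width like this:
using System;

class ServoPulseDemo
{
    // Map a commanded angle (0-180 degrees) onto the 1.0 ms - 2.0 ms pulse range,
    // repeated every 20 ms as described above.
    static double AngleToPulseMs(double angleDegrees)
    {
        double clamped = Math.Max(0.0, Math.Min(180.0, angleDegrees));
        return 1.0 + clamped / 180.0;   // 0 deg -> 1.0 ms, 90 deg -> 1.5 ms, 180 deg -> 2.0 ms
    }

    static void Main()
    {
        foreach (double angle in new[] { 0.0, 90.0, 180.0 })
            Console.WriteLine($"{angle,3:F0} deg -> {AngleToPulseMs(angle):F2} ms pulse every 20 ms");
    }
}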
Chapter 6
Circuit Design
We used an electronics design tool called Fritzing. It is an open-source hardware initiative that supports designers, artists, researchers and hobbyists in working creatively with interactive electronics. The project provides a software tool, a community website and services in the spirit of Processing and Arduino, fostering an ecosystem that allows users to document their prototypes, share them with others, teach electronics in a classroom, and lay out and manufacture professional PCBs. [15]
Figure 43: RAKS Circuit Simulator [21]
Figure 44: RAKS Circuit PCB[21]
Figure 45: Circuit Schematic[21]
Chapter 7
Mechanical Calculations
The hardware pieces are simply made from aluminum; most of them are 3 mm thick, as with the parts in Figure 46.
Figure 46: RAKS Parts
I made some modifications to let the arm work freely without requiring extra voltage. I also had some errors at the beginning, but I overcame them by redesigning further draft pages; you can refer to Chapter 2, where I list all the CAD designs. Here is an example of one modification, shown in Figure 47:
Figure 47: RAKS ARM & PART C
Figure 48: RAKS BASE PART
Figure 48 shows that we used three pieces to align the servo correctly and to mount it on the base.
Figure 49: Some of Part A
Figure 49 shows the joint that plays the role of an elbow.
Figure 50: RAKS PART B
Figure 50 shows part B, which is the second joint after the base.
Figure 51: Overall Parts
Of course, parts A, A Base, B and C were cut on the water jet machine shown in Figure 2. Those are all the parts of the RAKS, as shown in Figure 51; Figure 52 is the overall view.
Figure 52: RAKS OVERVIEW
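To give a flavor of the static calculation this chapter refers to, the following is a hedged sketch with assumed values (the arm mass, payload and reach below are illustrative, not the measured RAKS figures). It estimates the holding torque the shoulder servo needs when the arm is stretched out horizontally.
using System;

class HoldingTorqueEstimate
{
    static void Main()
    {
        double armMassKg = 1.2;   // assumed mass of the aluminum arm
        double payloadKg = 0.5;   // assumed load held in the gripper
        double reachM = 0.35;     // assumed horizontal distance from shoulder to load
        double g = 9.81;          // gravitational acceleration (m/s^2)

        // Treat the arm's own weight as acting at half the reach and the payload at full reach.
        double torqueNm = armMassKg * g * (reachM / 2.0) + payloadKg * g * reachM;
        Console.WriteLine($"Required holding torque is roughly {torqueNm:F2} N-m");
    }
}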
References
Books:
Websites:
Videos:
Software:
Appendix
The Arduino Mega 2560 is a microcontroller board based on the ATmega2560 (datasheet). It has 54 digital input/output pins (of which 14 can be used as PWM outputs), 16 analog inputs, 4 UARTs (hardware serial ports), a 16 MHz crystal oscillator, a USB connection, a power jack, an ICSP header, and a reset button. It contains everything needed to support the microcontroller; simply connect it to a computer with a USB cable or power it with an AC-to-DC adapter or battery to get started. The Mega is compatible with most shields designed for the Arduino Duemilanove or Diecimila.
The Arduino Uno is a microcontroller board based on the ATmega328 (datasheet). It has 14 digital input/output pins (of which 6 can be used as PWM outputs), 6 analog inputs, a 16 MHz ceramic resonator, a USB connection, a power jack, an ICSP header, and a reset button. It contains everything needed to support the microcontroller; simply connect it to a computer with a USB cable or power it with an AC-to-DC adapter or battery to get started.
//
// Description: This code reads joint coordinates from Kinect then turns
// Left and Right servos accordingly
// By: TechBitar/Hazim Bitar
// Modified By : Rafed Al Doohan
// Email: techbitar at gmail dot com
// Date: Jan 31, 2011
// Modified Date: May 1, 2013
// Email :rafed@ack-raks.com
#include <Servo.h>
Servo servoA;
Servo servoB;
Servo servoC;
Servo servoD;
Servo servoE;
void setup()
{
// start serial port at 9600 bps:
Serial.begin(9600);
servoA.attach(3);
servoA.write(90);
servoB.attach(4);
servoB.write(90);
servoE.attach(5);
servoE.write(90);
servoC.attach(6);
servoC.write(90);
servoD.attach(10);
servoD.write(90);
}
unsigned char x,y,z,j,l,m,n,o,p,q =0;
int val = 0;
void loop()
{
// read the 10-byte buffer sent from C# containing coordinates and the joint ID.
if (Serial.available() >= 10) {
x = Serial.read();
y = Serial.read();
z = Serial.read();
j = Serial.read();
l = Serial.read();
m = Serial.read();
n = Serial.read();
o = Serial.read();
p = Serial.read();
q = Serial.read();
if (j == 7) { // HandLeft
servoA.write(180-y); // sets the servo position according to the scaled value
delay(10); // waits for the servo to get there
}
if (j == 11) { // HandRight
servoB.write(y); // sets the servo position according to the scaled value
delay(10); // waits for the servo to get there
}
}
}
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Data;
using System.Windows.Documents;
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using System.Windows.Navigation;
using System.Windows.Shapes;
using Microsoft.Kinect;
using System.IO.Ports;
using Coding4Fun.Kinect.Wpf;
using Microsoft.Research.Kinect.Nui;
using KinectNUI.Business.Kinect;
namespace RAKS_Software
{
public partial class MainWindow : Window
{
//Skeleton Code++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
private const float RenderWidth = 640.0f;
private const float RenderHeight = 480.0f;
private const double JointThickness = 3;
private const double BodyCenterThickness = 10;
private const double ClipBoundsThickness = 10;
private readonly Brush centerPointBrush = Brushes.Blue;
private readonly Brush trackedJointBrush = new SolidColorBrush(Color.FromArgb(255, 68, 192, 68));
private readonly Brush inferredJointBrush = Brushes.Yellow;
private readonly Pen trackedBonePen = new Pen(Brushes.Green, 6);
private readonly Pen inferredBonePen = new Pen(Brushes.Gray, 1);
private KinectSensor sensor;
private DrawingGroup drawingGroup;
private DrawingImage imageSource;
//Skeleton Code++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
//bool closing = false;
// bool _continue;
//bool PersonDetected;
SerialPort _serialPort;
//int ScreenMaxX = 180;
//int ScreenMaxY = 180;
const int skeletonCount = 6;
Skeleton[] allSkeletons = new Skeleton[skeletonCount];
KinectSensor Sensor;
Runtime Nui = new Runtime();
public MainWindow()
{
InitializeComponent();
}
//Skeleton Code++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
private static void RenderClippedEdges(Skeleton skeleton, DrawingContext drawingContext)
{
if (skeleton.ClippedEdges.HasFlag(FrameEdges.Bottom))
{
drawingContext.DrawRectangle(
Brushes.Red,
null,
new Rect(0, RenderHeight - ClipBoundsThickness, RenderWidth, ClipBoundsThickness));
}
if (skeleton.ClippedEdges.HasFlag(FrameEdges.Top))
{
drawingContext.DrawRectangle(
Brushes.Red,
null,
new Rect(0, 0, RenderWidth, ClipBoundsThickness));
}
if (skeleton.ClippedEdges.HasFlag(FrameEdges.Left))
{
drawingContext.DrawRectangle(
Brushes.Red,
null,
new Rect(0, 0, ClipBoundsThickness, RenderHeight));
}
if (skeleton.ClippedEdges.HasFlag(FrameEdges.Right))
{
drawingContext.DrawRectangle(
Brushes.Red,
null,
new Rect(RenderWidth - ClipBoundsThickness, 0, ClipBoundsThickness, RenderHeight));
}
}
//Skeleton Code++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
private void Window_Loaded(object sender, RoutedEventArgs e)
{
//Skeleton Code++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
this.drawingGroup = new DrawingGroup();
this.imageSource = new DrawingImage(this.drawingGroup);
Image.Source = this.imageSource;
foreach (var potentialSensor in KinectSensor.KinectSensors)
{
if (potentialSensor.Status == KinectStatus.Connected)
{
this.sensor = potentialSensor;
break;
}
}
if (null != this.sensor)
{
// Turn on the skeleton stream to receive skeleton frames
this.sensor.SkeletonStream.Enable();
// Add an event handler to be called whenever there is new color frame data
this.sensor.SkeletonFrameReady += this.SensorSkeletonFrameReady;
//Skeleton Code++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
kinectSensorChooser1.KinectSensorChanged += new DependencyPropertyChangedEventHandler(kinectSensorChooser1_KinectSensorChanged);
Nui.SkeletonFrameReady += new EventHandler<Microsoft.Research.Kinect.Nui.SkeletonFrameReadyEventArgs>(Nui_SkeletonFrameReady);
Nui.Initialize(RuntimeOptions.UseSkeletalTracking);
#region TransformSmooth
Nui.SkeletonEngine.TransformSmooth = true;
Microsoft.Kinect.TransformSmoothParameters parameters = new Microsoft.Kinect.TransformSmoothParameters();
parameters.Smoothing = 0.8f;
parameters.Correction = 0.3f;
parameters.Prediction = 0.4f;
parameters.JitterRadius = 1.0f;
parameters.MaxDeviationRadius = 0.5f;
//Nui.SkeletonEngine.SmoothParameters = Microsoft.Kinect.TransformSmoothParameters parameters ;
#endregion
//Arduino Start
//ArduinoSetSerial();
//ArduinoOpenSerial();
//Arduino End
}
//+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
try
{
KinectSensor.KinectSensors.StatusChanged += Kinects_StatusChanged;
foreach (KinectSensor kinect in KinectSensor.KinectSensors)
{
if (kinect.Status == KinectStatus.Connected)
{
Sensor = kinect;
break;
}
}
if (KinectSensor.KinectSensors.Count == 0)
MessageBox.Show("No kinect found");
else
Initialize();
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
//+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
}
// Start the sensor!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
void Kinects_StatusChanged(object sender, StatusChangedEventArgs e)
{
switch (e.Status)
{
case KinectStatus.Connected:
if (Sensor == null)
{
Sensor = e.Sensor;
Initialize();
}
break;
case KinectStatus.Disconnected:
if (Sensor == null)
{
Clean();
MessageBox.Show("Kinect was Disconnected");
}
break;
case KinectStatus.NotReady:
break;
case KinectStatus.NotPowered:
if (Sensor == null)
{
Sensor = e.Sensor;
MessageBox.Show("Kinect is no longer powered");
}
break;
default:
MessageBox.Show("Unhandled Status: " + e.Status);
break;
}
}
private void Initialize()
{
if (Sensor == null)
return;
Sensor.Start();
}
private void Clean()
{
if (Sensor != null)
{
Sensor.Stop();
Sensor = null;
}
}
// Start the sensor!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
//Skeleton Code++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
private void SensorSkeletonFrameReady(object sender, Microsoft.Kinect.SkeletonFrameReadyEventArgs e)
{
Skeleton[] skeletons = new Skeleton[0];
using (Microsoft.Kinect.SkeletonFrame skeletonFrame = e.OpenSkeletonFrame())
{
if (skeletonFrame != null)
{
skeletons = new Skeleton[skeletonFrame.SkeletonArrayLength];
skeletonFrame.CopySkeletonDataTo(skeletons);
}
}
using (DrawingContext dc = this.drawingGroup.Open())
{
// Draw a transparent background to set the render size
dc.DrawRectangle(Brushes.Black, null, new Rect(0.0, 0.0, RenderWidth, RenderHeight));
if (skeletons.Length != 0)
{
foreach (Skeleton skel in skeletons)
{
RenderClippedEdges(skel, dc);
if (skel.TrackingState == Microsoft.Kinect.SkeletonTrackingState.Tracked)
{
this.DrawBonesAndJoints(skel, dc);
}
else if (skel.TrackingState == Microsoft.Kinect.SkeletonTrackingState.PositionOnly)
{
dc.DrawEllipse(
this.centerPointBrush,
null,
this.SkeletonPointToScreen(skel.Position),
BodyCenterThickness,
BodyCenterThickness);
}
}
}
// prevent drawing outside of our render area
this.drawingGroup.ClipGeometry = new RectangleGeometry(new Rect(0.0, 0.0, RenderWidth, RenderHeight));
}
}
private void DrawBonesAndJoints(Skeleton skeleton, DrawingContext drawingContext)
{
// Render Torso
this.DrawBone(skeleton, drawingContext, JointType.Head, JointType.ShoulderCenter);
this.DrawBone(skeleton, drawingContext, JointType.ShoulderCenter, JointType.ShoulderLeft);
this.DrawBone(skeleton, drawingContext, JointType.ShoulderCenter, JointType.ShoulderRight);
this.DrawBone(skeleton, drawingContext, JointType.ShoulderCenter, JointType.Spine);
this.DrawBone(skeleton, drawingContext, JointType.Spine, JointType.HipCenter);
this.DrawBone(skeleton, drawingContext, JointType.HipCenter, JointType.HipLeft);
this.DrawBone(skeleton, drawingContext, JointType.HipCenter, JointType.HipRight);
// Left Arm
this.DrawBone(skeleton, drawingContext, JointType.ShoulderLeft, JointType.ElbowLeft);
this.DrawBone(skeleton, drawingContext, JointType.ElbowLeft, JointType.WristLeft);
this.DrawBone(skeleton, drawingContext, JointType.WristLeft, JointType.HandLeft);
// Right Arm
this.DrawBone(skeleton, drawingContext, JointType.ShoulderRight, JointType.ElbowRight);
this.DrawBone(skeleton, drawingContext, JointType.ElbowRight, JointType.WristRight);
this.DrawBone(skeleton, drawingContext, JointType.WristRight, JointType.HandRight);
// Left Leg
this.DrawBone(skeleton, drawingContext, JointType.HipLeft, JointType.KneeLeft);
this.DrawBone(skeleton, drawingContext, JointType.KneeLeft, JointType.AnkleLeft);
this.DrawBone(skeleton, drawingContext, JointType.AnkleLeft, JointType.FootLeft);
// Right Leg
this.DrawBone(skeleton, drawingContext, JointType.HipRight, JointType.KneeRight);
this.DrawBone(skeleton, drawingContext, JointType.KneeRight, JointType.AnkleRight);
this.DrawBone(skeleton, drawingContext, JointType.AnkleRight, JointType.FootRight);
// Render Joints
foreach (Microsoft.Kinect.Joint joint in skeleton.Joints)
{
Brush drawBrush = null;
if (joint.TrackingState == Microsoft.Kinect.JointTrackingState.Tracked)
{
drawBrush = this.trackedJointBrush;
}
else if (joint.TrackingState == Microsoft.Kinect.JointTrackingState.Inferred)
{
drawBrush = this.inferredJointBrush;
}
if (drawBrush != null)
{
drawingContext.DrawEllipse(drawBrush, null, this.SkeletonPointToScreen(joint.Position), JointThickness, JointThickness);
}
}
}
private Point SkeletonPointToScreen(SkeletonPoint skelpoint)
{
DepthImagePoint depthPoint = this.sensor.CoordinateMapper.MapSkeletonPointToDepthPoint(skelpoint, DepthImageFormat.Resolution640x480Fps30);
return new Point(depthPoint.X, depthPoint.Y);
}
private void DrawBone(Skeleton skeleton, DrawingContext drawingContext, JointType jointType0, JointType jointType1)
{
Microsoft.Kinect.Joint joint0 = skeleton.Joints[jointType0];
Microsoft.Kinect.Joint joint1 = skeleton.Joints[jointType1];
// If we can't find either of these joints, exit
if (joint0.TrackingState == Microsoft.Kinect.JointTrackingState.NotTracked ||
joint1.TrackingState == Microsoft.Kinect.JointTrackingState.NotTracked)
{
return;
}
// Don't draw if both points are inferred
if (joint0.TrackingState == Microsoft.Kinect.JointTrackingState.Inferred &&
joint1.TrackingState == Microsoft.Kinect.JointTrackingState.Inferred)
{
return;
}
// We assume all drawn bones are inferred unless BOTH joints are tracked
Pen drawPen = this.inferredBonePen;
if (joint0.TrackingState == Microsoft.Kinect.JointTrackingState.Tracked && joint1.TrackingState == Microsoft.Kinect.JointTrackingState.Tracked)
{
drawPen = this.trackedBonePen;
}
drawingContext.DrawLine(drawPen, this.SkeletonPointToScreen(joint0.Position), this.SkeletonPointToScreen(joint1.Position));
}
//private void CheckBoxSeatedModeChanged(object sender, RoutedEventArgs e)
//{
// if (null != this.sensor)
// {
// if (this.checkBoxSeatedMode.IsChecked.GetValueOrDefault())
// {
// this.sensor.SkeletonStream.TrackingMode = SkeletonTrackingMode.Seated;
// }
// else
// {
// this.sensor.SkeletonStream.TrackingMode = SkeletonTrackingMode.Default;
// }
// }
//}
//Skeleton Code++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
void Nui_SkeletonFrameReady(object sender, Microsoft.Research.Kinect.Nui.SkeletonFrameReadyEventArgs e)
{
//frame1 Skeletons1 = e.SkeletonFrame;
////get the first tracked skeleton
//SkeletonData skeleton = (from s in Skeletons1.Skeletons
// where s.TrackingState == SkeletonTrackingState.Tracked
// select s).FirstOrDefault();
//if (skeleton != null)
//{
// //set position
// SetEllipsePosition(handLeft, skeleton.Joints[JointID.HandLeft]);
// SetEllipsePosition(elbowLeft, skeleton.Joints[JointID.ElbowLeft]);
// SetEllipsePosition(shoulderLeft, skeleton.Joints[JointID.ShoulderLeft]);
// SetEllipsePosition(handRight, skeleton.Joints[JointID.HandRight]);
// SendToArduino(skeleton.Joints(JointID.HandLeft), JointID.HandLeft);
// SendToArduino(skeleton.Joints(JointID.ElbowLeft), JointID.ElbowLeft);
// SendToArduino(skeleton.Joints(JointID.ShoulderLeft), JointID.ShoulderLeft);
// SendToArduino(skeleton.Joints(JointID.HandRight), JointID.HandRight);
//}
}
//private void SetEllipsePosition(FrameworkElement ellipse, Joint joint)
//{
// //string text2 = ("(X): " + joint.Position.X + ", (Y): " + joint.Position.Y + " (Z): " + joint.Position.Z);
// //using (System.IO.StreamWriter file = new System.IO.StreamWriter(@"C:\Temp\WriteText2.txt", true))
// //{
// // file.WriteLine(text2); // [-1,1]
// //}
// var scaledJoint = joint.ScaleTo(ScreenMaxX, ScreenMaxY, .5f, .2f);
// Canvas.SetLeft(ellipse, scaledJoint.Position.X);
// Canvas.SetTop(ellipse, scaledJoint.Position.Y);
//}
//void SendToArduino(Joint joint, JointID JID)
//{
// var scaledJoint = joint.ScaleTo(ScreenMaxX, ScreenMaxY, .5f, .2f);
// if(JointID.HandLeft == JID)
// {
// textBox1.Text = scaledJoint.Position.X;
// textBox2.Text = scaledJoint.Position.Y;
// }
// if (JointID.ElbowLeft == JID)
// {
// textBox1.Text = scaledJoint.Position.X;
// textBox2.Text = scaledJoint.Position.Y;
// }
// if (JointID.ShoulderLeft == JID)
// {
// textBox1.Text = scaledJoint.Position.X;
// textBox2.Text = scaledJoint.Position.Y;
// }
// if (JointID.HandRight == JID)
// {
// textBox1.Text = scaledJoint.Position.X;
// textBox2.Text = scaledJoint.Position.Y;
// }
// ArduinoSendByte(scaledJoint.Position.X,scaledJoint.Position.Y);
//}
void kinectSensorChooser1_KinectSensorChanged(object sender, DependencyPropertyChangedEventArgs e)
{
KinectSensor old = (KinectSensor)e.OldValue;
StopKinect(old);
KinectSensor sensor = (KinectSensor)e.NewValue;
if (sensor == null)
{
return;
}
}
private void ArduinoSetSerial()
{
string ArduinoCom = textBox9.Text;
_serialPort = new SerialPort();
_serialPort.PortName="COM" + Trim(textBox9.Text);
_serialPort.BaudRate=9600;
_serialPort.DataBits=8;
_serialPort.Handshake=0;
_serialPort.ReadTimeout=500;
_serialPort.WriteTimeout=500;
}
private string Trim(string p)
{
throw new NotImplementedException();
}
private void ArduinoOpenSerial()
{
if(!_serialPort.IsOpen )
_serialPort.Open();
else
MessageBox.Show("ARDUINO: SERIAL PORT CANNOT BE OPENED");
// _continue = true;
}
private void ArduinoCloseSerial()
{
if(_serialPort.IsOpen)
{
_serialPort.Close();
}
}
//private void ArduinoSendByte(Single kinect_x, Single kinect_y, Single kinect_z, int kinect_j)
//{
// byte x,y,z,j;
// Single sx,sy;
// int HowOften;
// textBox9.Text="NA";
// x=Math.Abs(sbyte kinect_x);
// y=Math.Abs(sbyte kinect_y);
// z=sbyte(kinect_z) ;
// j=sbyte(kinect_j) ;
// x=x;
//}
//public void ConvertByteSingle(byte x) {
// float kinect_x;
// // Byte to float conversion will not overflow.
// kinect_x = System.Convert.ToSingle(x);
// System.Console.WriteLine("The byte as a float is {0}.",kinect_x);
// // Float to byte conversion can overflow.
// try
// {
// x = System.Convert.ToByte(kinect_x);
// System.Console.WriteLine("The float as a byte is {0}.",x);
// }
// catch (System.OverflowException)
// {
// System.Console.WriteLine("The float value is too large for a byte.");
// }
private void StopKinect(KinectSensor sensor)
{
if (sensor != null)
{
if (sensor.IsRunning)
{
//stop sensor
sensor.Stop();
//stop audio if not null
if (sensor.AudioSource != null)
{
sensor.AudioSource.Stop();
}
}
}
}
private void Window_Closed(object sender, EventArgs e)
{
Nui.Uninitialize();
}
//Skeleton Code++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
private void Window_Closing(object sender, System.ComponentModel.CancelEventArgs e)
{
if (null != this.sensor)
{
this.sensor.Stop();
}
}
//Skeleton Code++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
} }
End of Document