Engineering Projects/Robot car/Howard Community College/Fall2012/p1-503-mcf


Problem Statement[edit | edit source]

The goal is to turn a toy car into an autonomous vehicle. The car needs enough onboard intelligence to be aware of its environment in real time and to avoid obstacles it detects in its path. We also need a way to take control of the car remotely in case it gets stuck, for example in a hole.

Testing with smaller motors

Team Members[edit | edit source]

Akoumany
Cooke
Foley

Summary[edit | edit source]

We divided the project into three parts that we thought were critical and would push the project forward if completed. We took home the Kinect, Arduino, motors, LEDs, and a Wi-Fi shield for the Arduino so that we could familiarize ourselves with how everything worked. We read tutorials available on Wikiversity to improve our understanding of the task at hand. We also created a page for HCC's Arduino resources. You're welcome.

We downloaded the required software and started tweaking code from the tutorials. We got small motors running on the Arduino, built a power supply for the Diamondback board, and successfully connected the shield to a Wi-Fi network. We found out how the Kinect uses colors to represent the distances of objects in its view, and we hooked the Arduino up to the car and got it moving around.
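
To give an idea of the small-motor tests mentioned above, here is a minimal Arduino sketch that ramps a small DC motor up and down with PWM through a transistor. The pin number and timing values are assumptions for illustration, not necessarily what we used.

  // A minimal sketch, assuming the small motor is switched by an NPN
  // transistor whose base is driven through a resistor from pin 9
  // (a PWM pin on the Uno). Pin and timing values are placeholders.
  const int MOTOR_PIN = 9;   // PWM-capable pin driving the transistor

  void setup() {
    pinMode(MOTOR_PIN, OUTPUT);
  }

  void loop() {
    // Ramp the motor speed up, then back down.
    for (int speed = 0; speed <= 255; speed += 5) {
      analogWrite(MOTOR_PIN, speed);
      delay(50);
    }
    for (int speed = 255; speed >= 0; speed -= 5) {
      analogWrite(MOTOR_PIN, speed);
      delay(50);
    }
  }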

Poster[edit | edit source]


Story[edit | edit source]

Kinect
Arduino 9V Power Supply

For the first week of progress, the team needed to understand the Arduino software and its capabilities. Each member took time to familiarize himself with the Arduino by trying out the software and hardware and looking into various compatible products. This resulted in a Wikiversity page for HCC's available equipment, as well as some ideas about where to take the project.
In the second week, we focused on individual aspects of the project, dividing it into three parts:

  1. Control the car's movement with the Arduino
  2. Work with the Kinect and find out how to integrate its data into the Arduino
  3. Work with the Diamondback Wi-Fi shield, which will allow commands to be sent to the car

We figured out that the Kinect color-codes items by distance, learned how to connect the Diamondback to the internet, and hooked an Arduino up to various motors and managed their speed. The Diamondback needed a power supply, so we built one powered by a 9-volt battery. After tweaking code we found in tutorials, we were able to connect the Diamondback to our home Wi-Fi network. To get the Arduino to control the car, we first started by hooking the Arduino up to LEDs and small motors. We used a transistor and a resistor to control a small propeller motor. Transitioning this setup to the car proved difficult because of the car's more powerful motors. We had to use the Monster Moto shield and an external power source to get the car's motor to work. We wrote a tutorial about this here.

During the third week, we searched for software capable of utilizing the Kinect to create a three-dimensional map of the environment. The first one we came across was Kintinuous, a 3D-mapping software developed by a team led by Thomas Whelan of the National University of Ireland, Maynooth. This software would have been perfect for our project because it works with the Kinect to "virtually translate and allow the surface to 'fall out' into a large global map," creating a "highly dense map of the observed environment." However, upon further investigation, we found that the Kintinuous software is not for sale or available for download. Instead, we downloaded four Kinect software development kits: CL NUI, Kinect for Windows, ROS, and OpenKinect. We explored the Kinect for Windows SDK the most because it appeared to be the most useful. Using the KinectExplorer application from the Kinect for Windows SDK, we were able to get live three-dimensional streaming of the environment through the Kinect into the Kinect Studio 3D Viewer window. Unfortunately, the Kinect for Windows software does not translate the live three-dimensional streaming into a savable, reusable, global map the way Kintinuous does, so the Kinect for Windows SDK will not be quite as convenient for this project.

During the fourth week, we attempted to figure out how to connect the Kinect to the Arduino. We searched through the Kinect sample programs and selected the KinectDepthViewer sample as a starting point for connecting the Kinect to the Arduino through software. We thought of possible ways to connect them: (1) adding the Arduino into the Kinect program so that the Kinect recognizes the Arduino and can communicate with it, (2) adding the Kinect into an Arduino program to accomplish the same purpose the opposite way, (3) having a standalone Kinect program that feeds information into a standalone Arduino program, and (4) having a Kinect program and an Arduino program both feed information into a separate program that uses them both to make decisions. We also found certain variables and methods in the KinectDepthViewer sample program that would need to be made public in any case, such as private void ConvertDepthFrame, private void DepthImageReady, outputBitmap, tooNearDepth, tooFarDepth, and unknownDepth.
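
The Monster Moto shield step described above can be sketched roughly as follows. This is a minimal, hedged example assuming the shield's commonly documented pin mapping for one motor channel (IN A on pin 7, IN B on pin 8, PWM on pin 5); check the shield documentation and our tutorial before wiring the car's drive motor, which must be powered from the external supply.

  // Minimal sketch for one channel of the Monster Moto shield
  // (assumed pins: IN A = 7, IN B = 8, PWM = 5).
  const int IN_A = 7;     // direction input A
  const int IN_B = 8;     // direction input B
  const int PWM_PIN = 5;  // speed control (PWM)

  void setup() {
    pinMode(IN_A, OUTPUT);
    pinMode(IN_B, OUTPUT);
    pinMode(PWM_PIN, OUTPUT);
  }

  void forward(int speed) {
    digitalWrite(IN_A, HIGH);   // one direction
    digitalWrite(IN_B, LOW);
    analogWrite(PWM_PIN, speed);
  }

  void stopMotor() {
    digitalWrite(IN_A, LOW);    // both low lets the motor coast
    digitalWrite(IN_B, LOW);
    analogWrite(PWM_PIN, 0);
  }

  void loop() {
    forward(200);   // drive forward at roughly 80% duty cycle
    delay(2000);
    stopMotor();
    delay(2000);
  }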
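
The Diamondback uses its own Wi-Fi library, so the exact code we tweaked is not reproduced here; the sketch below only shows the general connect-and-report pattern using the standard Arduino WiFi library, with placeholder network credentials.

  #include <SPI.h>
  #include <WiFi.h>

  char ssid[] = "yourNetwork";   // placeholder SSID
  char pass[] = "yourPassword";  // placeholder WPA passphrase

  void setup() {
    Serial.begin(9600);
    // Keep trying to join the network, reporting over the serial monitor.
    while (WiFi.begin(ssid, pass) != WL_CONNECTED) {
      Serial.println("Connecting...");
      delay(5000);
    }
    Serial.print("Connected, IP address: ");
    Serial.println(WiFi.localIP());
  }

  void loop() {
    // Once connected, commands for the car could be received here.
  }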
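
Option (3) above could look something like the sketch below on the Arduino side: a standalone Kinect program on the PC decides, from values such as tooNearDepth, whether the path is blocked, and sends a single character over the USB serial link. The 'S'/'G' protocol and the use of the on-board LED are hypothetical placeholders, not something the Kinect SDK provides.

  // Hypothetical protocol: the PC-side Kinect program sends 'S' when an
  // obstacle is too near and 'G' when the path is clear. The on-board
  // LED stands in for the motor logic.
  const int LED_PIN = 13;

  void setup() {
    Serial.begin(9600);       // must match the PC program's baud rate
    pinMode(LED_PIN, OUTPUT);
  }

  void loop() {
    if (Serial.available() > 0) {
      char command = Serial.read();
      if (command == 'S') {
        digitalWrite(LED_PIN, HIGH);   // obstacle too near: stop
      } else if (command == 'G') {
        digitalWrite(LED_PIN, LOW);    // path clear: go
      }
    }
  }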

Decision List[edit | edit source]

Material List[edit | edit source]

  1. Arduino Uno
  2. SparkFun Monster Moto Shield
  3. A small toy car with just front and back wheels
  4. 9 V battery to power the Arduino
  5. Microsoft Kinect
  6. A Wi-Fi shield for the Arduino

Things that could be incorporated into future toy car projects:

  1. Bluetooth Mate Silver Shield (needed)
  2. A G1 android phone could help students who don't already have one. (needed)
  3. USB Host Shield (already in the HCC inventory)

Software List[edit | edit source]

  1. Arduino IDE
  2. Eclipse for Java
  3. Android SDK

Time[edit | edit source]

67 Hours

Tutorials[edit | edit source]

  1. Easy Setup: Arduino and Monster Motor Shield Tutorial
  2. Howard Community College/Arduino Resources

Next Steps[edit | edit source]

The next step would be to start mapping rooms with the Kinect and to feed the Kinect data to the Arduino. Other sensors, such as proximity sensors, could also be used. The next team will also need to find a way to send commands to the Arduino using either a Wi-Fi or Bluetooth shield. This probably means the team will need to create an app or some kind of push-button system that tells the Arduino what to do.
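
As a hedged starting point for that push-button or app idea, the sketch below has the Arduino listen for single-character commands on a SoftwareSerial port, the way a Bluetooth Mate Silver wired to pins 2 and 3 would appear. The pins, baud rate, command characters, and Monster Moto pin numbers are all assumptions; in particular, the Bluetooth Mate's default baud rate should be checked against its documentation.

  #include <SoftwareSerial.h>

  // Assumed wiring: Bluetooth module TX -> pin 2, RX -> pin 3.
  SoftwareSerial bluetooth(2, 3);  // RX, TX

  const int IN_A = 7;     // Monster Moto direction pin A (assumed)
  const int IN_B = 8;     // Monster Moto direction pin B (assumed)
  const int PWM_PIN = 5;  // Monster Moto speed pin (assumed)

  void setup() {
    bluetooth.begin(9600);  // check the module's actual default baud rate
    pinMode(IN_A, OUTPUT);
    pinMode(IN_B, OUTPUT);
    pinMode(PWM_PIN, OUTPUT);
  }

  void loop() {
    if (bluetooth.available() > 0) {
      char command = bluetooth.read();
      if (command == 'F') {            // forward
        digitalWrite(IN_A, HIGH);
        digitalWrite(IN_B, LOW);
        analogWrite(PWM_PIN, 200);
      } else if (command == 'B') {     // backward
        digitalWrite(IN_A, LOW);
        digitalWrite(IN_B, HIGH);
        analogWrite(PWM_PIN, 200);
      } else if (command == 'S') {     // stop
        analogWrite(PWM_PIN, 0);
      }
    }
  }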