We are attempting to correct the "drift" that the copter experiences after take-off. One idea is to use the heavily processed odometry data from the Kinect. We pulled odometry from the Kinect using rtabmap_ros, then simulated movement along the x and y axes similar to what we are experiencing in flight:
By subscribing to the Kinect's pose, we should be able to adjust pitch and roll in our flight script to hold the vehicle's position.
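As a starting sketch of that idea, here is a proportional correction from pose error to MultiWii RC values. Everything here is hypothetical and untuned: the function name, the gain, and the sign conventions are placeholders, and in the real flight script the errors would come from a subscriber on the rtabmap_ros odometry topic.

```python
# Sketch of position hold: map the Kinect's pose error (meters) to
# roll/pitch RC commands (microseconds). Gains, signs, and limits are
# placeholders to be tuned on the real vehicle.

def hold_position(x_err, y_err, gain=50, center=1500, limit=100):
    """Return (roll, pitch) RC values that push back against drift."""
    def clamp(v):
        return max(-limit, min(limit, int(v)))
    # Sign convention assumed here: positive x error -> pitch correction,
    # positive y error -> roll correction; flip signs during testing.
    pitch = center + clamp(-gain * x_err)
    roll = center + clamp(-gain * y_err)
    return roll, pitch

# No drift -> sticks stay centered.
print(hold_position(0.0, 0.0))  # (1500, 1500)
```

A proportional term alone will still oscillate around the setpoint; a full PID on each axis is the likely end state.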
Monday, November 28, 2016
Tuesday, November 22, 2016
Recording Flights From MultiWii
We are using PyMultiwii to record data as we fly:
while True:
    # Poll the RC channel values from the flight controller
    self.board.getData(MultiWii.RC)
    s = str(self.board.rcChannels['roll']) + ", " + \
        str(self.board.rcChannels['pitch']) + ", " + \
        str(self.board.rcChannels['yaw']) + ", " + \
        str(self.board.rcChannels['throttle'])
    with open("data.txt", "a") as myfile:
        myfile.write(s + "\n")
    time.sleep(0.05)  # 20Hz
We now have some data from a take-off at a frequency of 20Hz:
[roll],[pitch],[yaw],[throttle]
1501, 1503, 1511, 1331
1501, 1503, 1511, 1339
1501, 1503, 1511, 1347
1501, 1503, 1511, 1359
1501, 1503, 1511, 1367
1501, 1503, 1511, 1371
1501, 1503, 1511, 1384
1501, 1503, 1511, 1390
1501, 1503, 1511, 1397
1501, 1503, 1511, 1411
1501, 1503, 1511, 1431
1501, 1503, 1511, 1439
1501, 1503, 1511, 1447
1501, 1503, 1511, 1456
1501, 1503, 1511, 1462
1501, 1503, 1511, 1465
1501, 1503, 1511, 1468
1501, 1503, 1511, 1478
1501, 1503, 1511, 1478
1501, 1503, 1511, 1478
1501, 1503, 1511, 1481
1501, 1503, 1511, 1487
1501, 1503, 1511, 1490
1501, 1503, 1511, 1490
1501, 1503, 1511, 1490
1501, 1503, 1511, 1490
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1511, 1492
1501, 1503, 1509, 1492
1501, 1503, 1496, 1492
1501, 1503, 1484, 1492
1501, 1503, 1467, 1492
1501, 1503, 1445, 1492
1501, 1503, 1425, 1492
1501, 1503, 1417, 1492
1501, 1503, 1415, 1492
1501, 1503, 1404, 1494
1501, 1503, 1399, 1500
1501, 1503, 1394, 1505
1501, 1503, 1394, 1505
1501, 1503, 1394, 1505
1501, 1503, 1401, 1509
1501, 1503, 1408, 1515
1501, 1503, 1408, 1517
1501, 1503, 1408, 1517
1501, 1503, 1408, 1517
1501, 1503, 1408, 1517
1501, 1503, 1408, 1517
1501, 1503, 1408, 1517
1501, 1503, 1408, 1517
1501, 1503, 1408, 1517
This is a graph of the data over the 20 seconds:
Between two and four seconds, the yaw goes to 1800, which arms the copter; by the sixth second the throttle is at its MINTHROTTLE, 1350, and we are increasing the throttle. We take off at about eight seconds.
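A quick sketch (the helper names are mine, not part of the flight code) of how we can scan a log like the one above for events, for example the first sample at or above MINTHROTTLE:

```python
# Locate take-off events in the logged RC data. The log format is the
# "[roll],[pitch],[yaw],[throttle]" lines above, sampled at 20Hz.

MINTHROTTLE = 1350  # from our MultiWii config

def parse_log(text):
    """Parse 'r, p, y, t' lines into (roll, pitch, yaw, throttle) tuples."""
    return [tuple(int(v) for v in line.split(","))
            for line in text.strip().splitlines()]

def first_above_minthrottle(samples, hz=20):
    """Return the time in seconds of the first sample at/above MINTHROTTLE."""
    for i, (_, _, _, throttle) in enumerate(samples):
        if throttle >= MINTHROTTLE:
            return i / hz
    return None

log = """1501, 1503, 1511, 1331
1501, 1503, 1511, 1339
1501, 1503, 1511, 1347
1501, 1503, 1511, 1359"""
print(first_above_minthrottle(parse_log(log)))  # 0.15
```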
This is the take-off in greater detail:
Friday, November 18, 2016
CRIUS AIOP v2 Sensor Config
Problem: Slow reaction to changes in yaw; the copter rocks back and forth and shakes before take-off, and changing the mode has no effect.
Solution: Manually configure the orientation of the sensors in MultiWii's config.h (around line 200):
#define FORCE_ACC_ORIENTATION(X, Y, Z)  {imu.accADC[ROLL]  = -X; imu.accADC[PITCH]  = -Y; imu.accADC[YAW]  =  Z;}
#define FORCE_GYRO_ORIENTATION(X, Y, Z) {imu.gyroADC[ROLL] =  Y; imu.gyroADC[PITCH] = -X; imu.gyroADC[YAW] = -Z;}
#define FORCE_MAG_ORIENTATION(X, Y, Z)  {imu.magADC[ROLL]  =  X; imu.magADC[PITCH]  =  Y; imu.magADC[YAW]  = -Z;}
Monday, November 7, 2016
Kinect Tutorial
NOTE before starting: I have not done a top-to-bottom run of this yet, but I will soon to double-check it. There may be some steps missing since I am not a perfect note taker. I'm just waiting on another SD card so I can do the install without wiping my current progress.
Now we can use ROS. If you are confused about how ROS works, the wiki is great at explaining things, though it goes pretty deep; I'll put a link to a future post here with a simpler explanation. Below I have put together a list of handy commands to get the Kinect up and running in ROS.
Install Ubuntu 16.04 Xenial on the Pi
OK, let's do this. First we need to download the image for the Pi. I am betting this will work with the Pi 2 as well, but I only tested it on the Pi 3. Here are some links to some of the Pi images:
- Raspberry Pi 2
- Raspberry Pi 3 - Thanks to Ryan Finnie
I used an Ubuntu laptop to format the SD card and write the image to it, but here are the tutorials from the Raspberry Pi Foundation:
Now boot up the Raspberry Pi. If you used the image from Ryan Finnie, the username and password are both ubuntu. You will be prompted to change the password, so remember what you change it to.
Install ROS (Kinetic Distro)
We have some updates first
Then install ROS and setup your work space
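A sketch of those steps, assuming ROS Kinetic on Ubuntu 16.04; the package choice (ros-base) and workspace layout are my assumptions, and you should get the current repository key from the install guide in the works cited before running this:

```shell
# Bring the system up to date first.
sudo apt-get update && sudo apt-get upgrade -y

# Add the ROS apt repository, then install ROS Kinetic
# (ros-base keeps the footprint small on the Pi).
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt-get update
sudo apt-get install -y ros-kinetic-ros-base
sudo rosdep init && rosdep update

# Set up a catkin workspace.
source /opt/ros/kinetic/setup.bash
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws && catkin_make
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
```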
Install Some ROS Packages
ROS has some packages that you can now install via apt. Here are the ones that I installed to work with the kinect and a few others that are useful as well
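I don't have the exact list recorded here, but the installs would look something like this (the freenect driver for the Kinect plus a couple of utilities; the package names are my assumption):

```shell
# Kinect driver plus viewing/mapping utilities.
sudo apt-get install -y ros-kinetic-freenect-launch \
                        ros-kinetic-image-view \
                        ros-kinetic-rtabmap-ros
```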
I had to reboot after this as well.
Using Your Kinect With ROS
Here is how you start the core ROS process which will be the "master"
I like to run the process in the foreground and just open a new terminal but that is up to you. In a new terminal you can start the node for the kinect by running
ROS should be publishing topics now. A topic is like a data object or an endpoint of an API. You can see the topics by running rostopic list. You will notice that one of the topics is /camera/rgb/image_color; we can look at the RGB image with image_view.
That should bring up a small window in which you can see what your Kinect sees in RGB. Now you can mess with it and see some other cool stuff as well since the Kinect node publishes several different data types.
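Assuming the freenect driver package from the install step, the command sequence described above looks roughly like:

```shell
# Start the ROS master (run this in its own terminal).
roscore

# In a new terminal: launch the Kinect driver node.
roslaunch freenect_launch freenect.launch

# List the topics the Kinect node is publishing.
rostopic list

# View the RGB stream.
rosrun image_view image_view image:=/camera/rgb/image_color
```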
Bill of Materials
- Raspberry Pi 3
- Kinect for Xbox 360 - I got mine at GameStop for ~$35 w/ the AC adapter
- AC Adapter
- If you don't already have a bunch of Micro USB Cables, you will need one for the raspberry pi
- Keyboard & Mouse for working on the pi
Works Cited
- https://www.youtube.com/watch?v=OqOkpZBOpxY
- http://stackoverflow.com/questions/23901220/how-do-i-get-kinect-depth-image-data-in-centimeters-using-the-libfreenect-python
- http://mathnathan.com/2011/02/depthvsdistance/
- https://groups.google.com/forum/#!topic/openkinect/k6exs5hDyQ4
- http://wiki.ros.org/rosbag/Code%20API
- http://www.cs.bham.ac.uk/internal/courses/int-robot/2015/notes/map.php
- http://wiki.ros.org/kinetic/Installation/Ubuntu
Thursday, November 3, 2016
Wednesday, October 12, 2016
Monday, September 19, 2016
Weight and Thrust Considerations
To make sure the copter can achieve flight, motors must be able to achieve thrust greater than the force of gravity acting on the copter. Assuming a goal of maximum thrust being twice the force from gravity, each motor must be able to achieve enough thrust to lift one quarter of twice the mass of the vehicle.
Battery Mass: 370.5g
Frame, ESC, Motor, Pi, Crius AIOP Mass: 750g
Kinect Mass: 390.5g
Total Vehicle Mass: 1.511Kg
Mass/motor: 378g
Therefore each motor should be capable of enough thrust to lift roughly 756g (twice the 378g per-motor share; we round to 750g).
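The arithmetic above, as a quick script:

```python
# Per-motor thrust requirement from the masses listed above.
component_masses_g = {
    "battery": 370.5,
    "frame_esc_motor_pi_crius": 750.0,
    "kinect": 390.5,
}
total_g = sum(component_masses_g.values())       # 1511.0 g total vehicle mass
per_motor_g = total_g / 4                        # each motor's share of the weight
thrust_margin = 2.0                              # goal: max thrust = 2x weight
required_thrust_g = thrust_margin * per_motor_g  # thrust each motor must produce
print(total_g, per_motor_g, required_thrust_g)
```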
Thursday, September 8, 2016
Current Schedule
- Cut out (Sept 8, 2016)
- Order Board from OSHPARK (Sept 8, 2016)
- Order Parts from Mouser (Sept 8, 2016)
- API setup for Multiwii (Sept 8, 2016)
- Kinect Depth Data Reading (Sept 8, 2016)
- Basic Automatic Flight Software Framework (Sept 15, 2016)
- Custom PCB Wired Up (Sept 22, 2016)
- Motor Test, No props (Sept 22, 2016)
- Hand Guided Flight Simulation (Sept 27, 2016)
- First Flight Test (October 6, 2016)
Tuesday, September 6, 2016
Battery Regulator
The Kinect for the Xbox 360 uses two power sources: +5V from the USB and 12V at 1.2A from an AC adapter or the Xbox 360 itself. In order to use the Kinect on our SLAM drone, we needed to provide the second power source independently. Since the copter runs on a 14.8V LiPo, we needed to create a power regulator circuit to step the voltage down and limit the current to 1 amp.
We decided to use TI's LM2675, a 12V 1A buck regulator in a small IC package:
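A quick sanity check on the buck design. These are idealized, lossless numbers, not the LM2675 datasheet design procedure, which governs the real part selection:

```python
# Idealized buck-converter numbers for the Kinect's 12V rail.
V_IN = 14.8    # 4S LiPo nominal voltage
V_OUT = 12.0   # Kinect's second supply rail
I_OUT = 1.0    # the regulator's rated output current

# In an ideal buck, the switch conducts for V_OUT/V_IN of each cycle.
duty = V_OUT / V_IN

# Lossless power balance: battery-side current is scaled down by the same ratio.
i_in = I_OUT * V_OUT / V_IN

print(f"duty cycle ~{duty:.0%}, battery-side current ~{i_in:.2f} A")
```

The step-down ratio is mild (about 81% duty cycle), which is comfortable territory for a fixed-12V buck IC like the LM2675.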
[Image: Copper pour for our first layout]
[Image: First layout for our 12V 1A Regulator]
We ordered the first prototype for the board on 9-5:
[Image: OSHPark PCB Visualization]
Acrylic Mount
To secure the devices to the copter's body, we created a custom mount cut from acrylic:
The center slats connect directly to the bottom of our frame; we attached the Pi and flight controller to the top mount, and the Kinect under the second mount:
[Image: Designs for Acrylic Mount]
[Image: Original plan to mount components to the quadcopter. The 3D models for the Kinect and Pi were found on Sketchup's 3D Warehouse.]
Tuesday, August 30, 2016
Milestones
Upcoming Milestones
- Serial control of flight controller from Raspberry Pi
- WIFI connection of Pi to Ground Control
- Control Flight Controller from Ground Control
- Complete hardware assembly
- Laser-cut acrylic frame to secure controllers to copter
- Blade guards (Laser-cut acrylic?)
- Fix Kinect to bottom of copter, may need to extend legs on frame
- Balance components
- Flight tests
- Implement SLAM algorithms
Completed Milestones
- Kinect controlled via Raspberry Pi
- Point cloud extracted from Kinect
- New flight controller installed and tested
- Battery power for all components
Electronic Components
[Image: Electronic Components Running on Battery Power]
- Main Controller: Raspberry Pi 3
- Ability to run Robot Operating System (ROS), which has existing open-source code for Kinect operability
- Built in WIFI for communication with ground control
- Easy serial communication with Flight Controller
- Flight Controller: Crius All-In-One Pro v2.0
- Running open-source MultiWii platform
- On board IMU, magnetometer, and barometer
- Ability to fine-tune PIDs
- Sensor: Xbox 360 Kinect
- ESCs: Turnigy Plush 30A ESC w/ BEC (replacing the HobbyWing 20A)
- Motors: NTM 2826-1100Kv / 252W (replacing the Suppo BL-2208/12)
- Battery: Pulse Ultra 3300mAh 14.8V 35C LiPo
[Image: Block Diagram of Electronic Components]
Summer Progress
In the very first meeting we had over the summer, we made the decision to make a huge change to our project. We realized that we would be extremely lucky to get through the project without breaking the lidar sensor at some point, and with a "cheap" lidar unit costing several hundred dollars, we decided we needed to try a different option. In the end we decided to go with an Xbox 360 Kinect sensor. The Kinect has both a regular RGB camera and a depth camera. The best part is that it only costs around $30 at your local Gamestop, and there is plenty of support for the Kinect in the ROS environment. The room mapping with the Kinect will also be much more appealing: instead of the two-dimensional bird's-eye-view map that the lidar would provide, the Kinect will enable us to recreate the room as a point cloud in three dimensions and with color. We are very satisfied with our decision to switch to the Kinect, especially since we have already fried two Kinects over the summer, which would have bankrupted us if they were lidar units.
Another decision we made over the summer was to use the Multiwii flight controller. This open-source project seems to work well with the Raspberry Pi we are using as the brains of our quad-copter, and it is well known for being cheap and simple while also being high quality.
As for a summer progress report, we were able to get ROS up and running on the Raspberry Pi and on the server computer that the Pi will report to. We hope to have the Pi report to the server in real time so that you can see the map as it is being generated, but that will be something extra to add once we have the base project running. We were able to get the Kinect working with a ROS package called rgbdslam and are able to generate colored three dimensional point clouds of the room by moving the Kinect around the room. We also managed to wire the Kinect, flight controller, and Pi to be battery powered.
The next goals that we are working on are to get the Pi and Multiwii talking to each other so that we can get our quad-copter off the ground and to come up with an obstacle avoidance routine using the Kinect's depth camera.
Members
- Ben Antczak - u0376370 [at] utah.edu
- Dusty Argyle - dusty.argyle@utah.edu
- Nick Hallstrom -
Previous Posts
Here are our previous meetings posted on our old website:
Meeting March 10, 2016
For our initial meeting, we decided what we would need to keep track of on the website. We also made the website available via a repository stored on Bitbucket so it remains private to our team.
We also started to decide on a timeline for our milestones. Nothing concrete has been decided, but some ballpark timelines were offered, and we will reflect on them in order to start nailing down our project's deadlines.
Meeting April 14, 2016
This week we met to determine a protocol that we will follow for the Bluetooth communication. This will allow us to control the quadcopter with the various commands shown in this document here.
We decided to take the weekend to try to implement the bluetooth communication as well as work on the IMU. Once we have some basic commands setup for safety and manual navigation, we will be able to do more in-depth testing.
Meeting March 17, 2016
This week we have started ordering parts. Our main focus is the quadcopter at least until summer. In the summer, we will get more serious about integrating the quadcopter and the lidar. This will start to provide a base for our SLAM implementation. The current parts are listed below:
- Aluminum Framing
- Brushless Motors x4
- Electronic Speed Controllers x4
- STM32F407 Microcontroller
- Jumper wires
- Gyrometer
Meeting April 21, 2016
This week we got together and tested the Bluetooth thoroughly to make sure that it was the protocol we wanted. We were able to tune the PID using the Bluetooth link, and we ran various other commands to manipulate the state of the quadcopter.
We have added some files to show our progress. This is our report, which shows things like the bill of materials, future applications, implementation strategies, etc.: Report
We also have a video that you can download and watch here: Video
Meeting March 24, 2016
This week we worked on the quadcopter's frame. The aluminum frame provides a good prototype base for our quadcopter. We mounted everything to the frame and put the frame in a test harness built from the same aluminum. We then mounted the board onto the quadcopter and started to test the quadcopter's microcontroller.
We have gotten a basic setup going for implementation and testing of the quadcopter. We have found one unforeseen issue thus far: we need a wireless kill switch. So we have ordered a Bluetooth module to attach to the quadcopter and plan to use it to connect to the computer so we can safely kill the device remotely.
Meeting March 31, 2016
This week we accomplished basic PID control of the motors. We also dented the aluminum frame of the quadcopter, so we have now ordered a significantly stronger frame made of a high-quality plastic with an on-board PCB in case we need it, along with some special mounts for the quadcopter motors. This frame should be in by the end of the week. We will have to gut the old frame and reset everything we have done thus far.
Meeting April 7, 2016
This week we worked on mounting our components to the frame. We thought we had broken one of the motors, but it turned out our power supply wasn't providing enough current; we calculated we will need about 8 amps to drive the motors completely. Shortly after requesting the new power supply, we received one to test with. We still have not tested it, since the quadcopter is still being put back together on the new frame. As soon as this is done and we set up shop, we will test with the old power supply first to make sure we are properly connected. Once we are in a good state, we will try the new power supply, since we don't know if it is in working condition. We are also working on the Bluetooth kill switch, which came in this week.