Wednesday, June 27, 2018

About Me

My name is Taylor Welker, and I am a master's student working as a Research Assistant at the University of Utah.  My advisor is Dr. Tom Henderson in the School of Computing, and we are working to help develop an intelligent system for autonomous planning and control of quadcopters.

My role on the research team is to develop the infrastructure for testing this system.  Specifically, I have been responsible for handling hardware related to the quadcopter, as well as for developing both 2D and 3D simulations that can be used to visualize the effectiveness of our methods.

My academic interests center on the development of intelligent systems.  Some of my favorite classes so far in my master's program include Artificial Intelligence, Machine Learning, and Motion Planning.

About the Project

The purpose of creating these Gazebo simulations is to prove the concepts that make up our BRECCIA system.

BRECCIA is an intelligent GIS tool that is designed to assist users in gathering and processing information utilizing autonomous quadcopters.  It contains a database of beliefs, desires, and intentions (a BDI framework) and uses it to make decisions on how best to achieve its goals.  If there is uncertainty within the database, it follows a set of plans to try to resolve it.

To read more about our project, please refer to these publications below:

``BRECCIA: A Multi-Agent Data Fusion and Decision Support System for Dynamic Mission Planning,'' David Sacharny, Thomas C. Henderson, Amar Mitiche, Robert Simmons, Taylor Welker and Xiuyi Fan, 2nd Conference on Dynamic Data Driven Application Systems (DDDAS 2017), Cambridge, MA, 7-9 August, 2017.

``BRECCIA: Unified Probabilistic Dynamic Geospatial Intelligence,'' David Sacharny, Thomas C. Henderson, Amar Mitiche, Robert Simmons, Taylor Welker and Xiuyi Fan, IEEE Conference on Intelligent Robots and Systems (IROS 2017 Late Breaking Paper), Vancouver, Canada, 24-28 September, 2017.

``A Probabilistic Logic for Multi-source Heterogeneous Information Fusion,'' Thomas C. Henderson, Robert Simmons, Amar Mitiche, Xiuyi Fan and David Sacharny, IEEE Conference on Multisensor Fusion and Integration, Daegu, South Korea, 15-18 November, 2017.

``BRECCIA: A Novel Multi-source Fusion Framework for Dynamic Geospatial Data Analysis,'' David Sacharny, Thomas C. Henderson, Robert Simmons, Amar Mitiche, Taylor Welker and Xiuyi Fan, IEEE Conference on Multisensor Fusion and Integration, Daegu, South Korea, 15-18 November, 2017.

``Probabilistic Logic for Intelligent Systems,'' Thomas C. Henderson, Robert Simmons, Bernard Serbinowski, Xiuyi Fan, Amar Mitiche, and Michael Cline, International Conference on Intelligent Autonomous Systems, Baden-Baden, Germany, 11-15 June, 2018.

Starting Dronekit-Python

The Command


python ~/dronekit-python/examples/SimMission/SimMission.py --connect 127.0.0.1:14552


Notes


*Be sure to replace "~/dronekit-python/examples/SimMission/SimMission.py" with the filepath to the script you've decided to use.

*Notice how it is connecting to 127.0.0.1:14552, which is one of the ports supplied by Mavproxy (see Starting Mavproxy for details).

*Remember, when Dronekit-Python starts, it searches for an available quadcopter for 30 seconds.  If it can't find one in that time, or loses connection with the quad for that amount of time, it shuts down automatically.  So be sure to have SITL, Mavproxy (and Gazebo if you are using it) running before you start Dronekit-Python.
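As a minimal sketch of what this looks like from inside a Python script (assuming the dronekit package is installed; the helper names here are my own, not part of the library):

```python
def udp_endpoint(host="127.0.0.1", port=14552):
    """Build the 'host:port' string that Dronekit-Python's connect() expects."""
    return "%s:%d" % (host, port)

def connect_to_quad(endpoint, timeout=30):
    """Connect to a (real or simulated) quadcopter over UDP.

    The dronekit import is local so udp_endpoint() can be used even
    where dronekit isn't installed.
    """
    from dronekit import connect  # pip install dronekit
    # wait_ready blocks until key vehicle attributes have arrived.
    # Both timeouts default to 30 seconds, matching the shutdown
    # behavior described in the notes above.
    return connect(endpoint, wait_ready=True,
                   timeout=timeout, heartbeat_timeout=timeout)

# Usage (requires SITL and Mavproxy to already be running):
#   vehicle = connect_to_quad(udp_endpoint())
#   print(vehicle.mode.name)
#   vehicle.close()
```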

Starting Mavproxy

The Command

mavproxy.py --master=udp:127.0.0.1:14550 --out=udpout:192.168.0.100:14550 --out=udpout:127.0.0.1:14552 --out=udpout:127.0.0.1:14553


Notes

*I am pretty sure these are the default UDP ports that SITL uses if there is one quad in the simulation.  These numbers will change if you try to use more than one.

*For each quadcopter you want to control, you will need another Mavproxy with slightly different UDP ports.

*"--master" is the UDP port that Mavproxy listens to, and the "--out" ports are where it forwards that information.  The first "--out" statement is reserved for a special purpose (I think it has something to do with the IP address starting with 192 rather than the local machine's 127), so we use the second two "--out" statements when starting Dronekit-Python.
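To make the port bookkeeping concrete, here is a hypothetical helper (the function name and the exact layout for additional quads are my extrapolation from the pattern above, not something Mavproxy provides) that reproduces the command for quad N:

```python
def mavproxy_command(instance=0, gcs_ip="192.168.0.100"):
    """Build the Mavproxy command line for SITL instance `instance`.

    SITL instance N talks on UDP port 14550 + 10*N; we forward to an
    external ground station plus two local ports for scripts.
    """
    base = 14550 + 10 * instance  # SITL's default UDP port for this quad
    return (f"mavproxy.py --master=udp:127.0.0.1:{base}"
            f" --out=udpout:{gcs_ip}:{base}"
            f" --out=udpout:127.0.0.1:{base + 2}"
            f" --out=udpout:127.0.0.1:{base + 3}")

print(mavproxy_command(0))  # reproduces the command shown above
# mavproxy_command(1) shifts every port up by 10 for a second quad.
```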

Starting Ardupilot

The Command


python ~/ardupilot/Tools/autotest/sim_vehicle.py -v ArduCopter -f gazebo-iris --console -L $LOCATION -m --mav10 -I0

Note on GPS initialization


This should start SITL and link it to an available Iris quadcopter in Gazebo.  You just have to replace '~' with the filepath to where you downloaded the ardupilot folder from GitHub, and $LOCATION with a location name (see the note on GPS initialization below).

GPS is critical to navigating our quadcopter in the simulation.  Therefore, we need to choose a single GPS source to work from, and sync SITL's and Gazebo's coordinate systems so that they both believe the quadcopter is at the same location.  In our simulation, we decided to use the GPS tied to SITL rather than an external one that could be installed into the Gazebo simulation (like 'hector').

We then need to override Gazebo's default behavior (by default, the origin represents 0 latitude and 0 longitude).  To do this, we will add a line to ardupilot/Tools/autotest/locations.txt.  This file contains a list of names with corresponding lat, lon, alt specifications.  Create a new name at the bottom of the file, and give it the GPS location that you want the origin of your Gazebo simulation to represent.  Then, using the command above, replace $LOCATION with the name you chose for that location.  This will make sure SITL and Gazebo are on the same page with their coordinate systems.
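For example, an entry of this sort at the bottom of locations.txt (the name and the coordinates here are placeholders I made up for illustration; substitute your own) would then be referenced as -L UOFU in the command above:

```
# Format, as I understand it: NAME=latitude,longitude,absolute-altitude,heading
UOFU=40.7649,-111.8421,1400,0
```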


Other Notes


*The "-I0" tells SITL which quadcopter we are starting.  For example, if we have two quadcopters in our simulation, you would run this command twice, once ending in -I0 and once in -I1; the IDs of these quadcopters are 0 and 1, respectively.  The UDP ports these quadcopters use are spaced 10 apart for each new quadcopter.  For example, if I0 uses UDP port 14550, then I1 would use 14560 and I2 would use 14570 (see Starting Mavproxy for more details on this topic).

*If you are starting SITL fresh or after a "wipe" (the -w flag), it may have to reload all of its parameters. In this case it can take a little time (30 seconds or so) to initialize.

Starting Gazebo

The Command

GAZEBO_RESOURCE_PATH="$GAZEBO_RESOURCE_PATH:/usr/share/gazebo-9/worlds" gazebo --verbose /usr/share/gazebo-9/worlds/$WORLD_FILE

Notes

*The "GAZEBO_RESOURCE_PATH" prefix just tells Gazebo to look in "/usr/share/gazebo-9/worlds" in addition to wherever else it looks for resources when building the world.

*In case it isn't clear in this format, this command is all one line.  The first part (up to 'worlds"') adds the folder to the resource path, while "gazebo --verbose" onward actually starts Gazebo and loads it with the world specified after it.

*You will have to modify the filepaths to fit your installation, as well as replace $WORLD_FILE with the name of your .world file.

*If you want to use an example, I would recommend iris_irlock_demo.world.  This should come with your gazebo install and can be found in your worlds folder.

Tuesday, June 26, 2018

Dronekit-Python

Installation Tutorial


http://python.dronekit.io/develop/installation.html


Purpose

Dronekit-Python is a library of Python code that allows you to send commands to the simulated quadcopter using a Python Script (as opposed to Ardupilot's clunkier command line interface).  This will allow you to write your own Python code to manipulate the behavior of the quad how you see fit. 

For example, I use 3 different scripts depending on the situation: one to give the user manual control over the quad, one to let the quad follow a mission in AUTO mode, and one similar to the second but using GUIDED mode so that the mission can change at any time throughout the simulation (but this last one is a topic for another post).

Things to Note


*Be sure to use the examples that come with the Dronekit-Python repository.  These can be modified to fit your needs and can teach you how to do things like take off or navigate to a new position.

*Dronekit-Python's connect function has a default timeout of 30 seconds.  If it cannot find the UDP port corresponding to an available quadcopter (either real or simulated), it will close the program.  If your SITL program has to initialize many parameters, then you might have to consider adding sleep statements or delays to ensure that SITL has enough time to get started before launching Dronekit-Python.

*AUTO mode does not allow anything to dynamically alter the mission stored in the quadcopter.  The quad must first land and disarm its motors before a new mission can be saved to it.  From there, it can take off and enter AUTO mode again.  This is the advantage of "GUIDED" mode over "AUTO": you can take control of the quad at any point and tell it to do something different.

*The weakness of "GUIDED" mode is that it takes constant input and cannot simply take in a mission file and complete it autonomously like "AUTO" mode can.  In our project, we want the quadcopter to complete a mission automatically, but also to be able to change its mission if the BRECCIA server prompts it to, even while it is in the air.  That is why I designed a dynamic planner in Python.  This planner can receive new missions from our BRECCIA server (which uses Java), parse these mission files, and send the waypoints to the quadcopter one after the other while it is in "GUIDED" mode.

*Dronekit-Python can only affect the quadcopter's state.  It cannot change the environment (things like wind speed or direction).
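To sketch the dynamic-planner idea from the notes above (the one-"lat,lon,alt"-per-line mission format below is a stand-in I made up for illustration, not our actual BRECCIA format):

```python
def parse_mission(text):
    """Parse 'lat,lon,alt' lines into a list of (lat, lon, alt) tuples.

    Blank lines and '#' comments are skipped.
    """
    waypoints = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        lat, lon, alt = (float(x) for x in line.split(","))
        waypoints.append((lat, lon, alt))
    return waypoints

def fly_mission(vehicle, waypoints):
    """Send waypoints one at a time while in GUIDED mode.

    Requires dronekit; the imports are local so parse_mission stays
    usable without it.
    """
    import time
    from dronekit import VehicleMode, LocationGlobalRelative
    vehicle.mode = VehicleMode("GUIDED")
    for lat, lon, alt in waypoints:
        vehicle.simple_goto(LocationGlobalRelative(lat, lon, alt))
        time.sleep(10)  # crude pacing; a real planner would check distance
```

Because each waypoint is sent individually, a new mission file arriving mid-flight can simply replace the remaining waypoint list, which is exactly what AUTO mode cannot do.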

Mavproxy

Installation Tutorial

https://ardupilot.github.io/MAVProxy/html/getting_started/download_and_installation.html

Purpose

Mavproxy is used as the link between a quadcopter within your SITL simulation and your Dronekit-Python scripts.  It helps your scripts find UDP ports that an SITL quadcopter is publishing and listening to.  This is not always required (SITL has a default port you can link to if you are using a single quadcopter), but is invaluable if you want to control more than one quadcopter in the same simulation or have multiple Ground Control Stations.



Ardupilot

Installation Tutorial

http://ardupilot.org/dev/docs/setting-up-sitl-on-linux.html

Purpose

Ardupilot is used to simulate the hardware of a quadcopter.  

For those of you not familiar with quadcopters, each one needs a "Flight Control Board".  These boards perform the tasks necessary to keep the quadcopter level and in the air.  Without them, the pilot would have to control each individual propeller rather than simply control the quadcopter's lateral, vertical, and rotational movement (go up/down, left/right, rotate).

Ardupilot is able to create a virtual flight control board in order to allow simulations to behave like the physical version.  Truthfully, it is the only program required to run SITL; Mavproxy and Dronekit-Python just make it simpler to use.


Things to Note


*Ardupilot should be a GitHub repo that is downloaded to your computer.  This folder has subfolders for planes, rovers, quads, etc...  However, we primarily use the ArduCopter SITL option.

*The program to run SITL can be found in Tools/autotest/sim_vehicle.py

*Be sure to add the required path files to the .bashrc file as described in the installation tutorial.  This makes it infinitely easier to run SITL.

*Ardupilot and SITL simulate a quadcopter AND an environment.  Granted, this environment is limited to very basic parameters (location, wind speed, etc.).  It cannot simulate obstacles by itself (other than the ground, I guess).
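For reference, the path additions from the installation tutorial look something like this (assuming you cloned ardupilot into your home directory; adjust the path if not):

```shell
# Append to ~/.bashrc, then run `source ~/.bashrc` in open terminals:
export PATH=$PATH:$HOME/ardupilot/Tools/autotest
export PATH=/usr/lib/ccache:$PATH
```

With Tools/autotest on your PATH, sim_vehicle.py can be run from any directory.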

Blender to Gazebo

After obtaining the .stl and .dae files, copy them, along with the files that were used to make them (the DEM file, and the image file used for the texture) into the same folder as your .world file.  Then, you can add them to your .world file in the following manner:

<model name="my_mesh">
  <pose>0 0 0 0 0 0</pose>
  <static>true</static>
  <link name="body">
    <collision name="collision">
      <geometry>
        <mesh><uri>file:///usr/share/gazebo-9/worlds/uofu_labeless_aligned_truenorth.stl</uri></mesh>
      </geometry>
    </collision>
    <visual name="visual">
      <geometry>
        <mesh><uri>file:///usr/share/gazebo-9/worlds/uofu_labeless_aligned_truenorth.dae</uri></mesh>
      </geometry>
    </visual>
  </link>
</model>

Notice how the .stl file is used for detecting collisions with the quadcopter (or other models), and the .dae file provides the visual representation of the mesh.  Neither can do the job of the other, but they should line up pretty well, so it shouldn't present much of a problem.

After this, you will probably have to change the Quadcopter's spawn position so that it is still on top of the mesh.  But you are essentially done setting up your environment!  Congratulations!

DEM Files

DEM (Digital Elevation Model) files are exactly what they sound like: they hold elevation models.  These elevation models correspond to areas that have been mapped in a digital format for 3D reconstruction.  They usually have a .tif or .dem file extension (we use .tif files generally).

These are usually collected by the government or private corporations using LIDAR attached to planes.  These planes fly over the area, gathering the elevation of the terrain using LIDAR.  Lucky for us, many of these datasets are open to the public and published on sites like 

http://opentopography.org/ 
https://gis.utah.gov/data/elevation-and-terrain/

These sites usually contain a GUI or command line interface that allow you to download a DEM corresponding to the particular area that you are interested in.

For a great tutorial on DEM files, see http://gazebosim.org/tutorials?tut=dem .  However, note that for our project we do not import the .dem files directly into Gazebo, as they do not easily allow us to change their texture or their orientation/rotation.

In my experience, I would download a .dem file and notice, after importing it into Gazebo, that its coordinate system was 90 degrees off from Gazebo's inherent system.  So north in Gazebo could be east in the heightmap.  This was a problem because heightmaps cannot be rotated in Gazebo (as far as I could tell, and according to this site: https://bitbucket.org/osrf/gazebo/issues/2167/not-possible-to-change-the-pose-of).

Once you have your DEM file, you will need to get a Google Earth image corresponding to the same area.  It might be a good idea to follow the tutorial mentioned above, not to get a final product, but to visualize the area in Gazebo.  On the other hand, if you still have the GUI open, you could use that to help "guesstimate" the area you need.  In my case, I simply used the snipping/screenshot tool to save a label-less Google Earth image of the part of campus that I wanted to use.

From there, you are going to need Blender.

Blender: Environment Creation

Blender is a "free and open source 3D creation suite" and is often used for animation or 3D modeling for 3D printers.  In our case, we use a special plugin to import a DEM file into Blender and turn it into a 3D mesh.  From there, we will export the mesh as both an .stl file and a Collada (.dae) file.


BlenderGIS Plugin


The plugin we use, and many great tutorials on how to use it can be found here: 
https://github.com/domlysz/BlenderGIS

Follow the tutorial there on how to add the plugin to Blender.


Getting the DEM file into Blender


Once the plugin is in your copy of Blender, go to File > Import > Georeferenced raster.  From there, see the options on the bottom left of the import screen.  Select the option to import the raster "As DEM", and be sure NOT to check the box "Apply on existing mesh".  Then, import your DEM file.  You should see an image corresponding to your terrain; it might look something like this:

Attaching Image to DEM mesh


Now, you will need to attach that Google Earth picture to the mesh you've just created.  Do this by looking at the right-hand side of the screen for the icons representing a camera, pictures, the earth, and others.  Scroll through those icons until you reach the material and texture tabs (represented by a checkered sphere and a checkered board, respectively).  Add a material to the mesh, and make its texture the image you got from Google Earth.  You may need to find some Blender tutorials if you struggle with this part.

Be sure to select the 'Shadeless' option in Materials > Shading, and to select Viewport Shading > Material at the bottom middle of the screen to see what the mesh looks like in the preview window.


Lining up the Image


You will probably notice that the image is not perfectly lined up with your DEM heightmap.  Unfortunately, this cannot be fixed within Blender (as far as I could tell), so I had to use MATLAB to shift the image until it lined up correctly.  The MATLAB script I used is below:

%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Shift the texture 40 pixels to the right (wrapping around the edge).
% 'img' avoids shadowing MATLAB's built-in image() function.
img = imread('uofu_texture_labeless.PNG');
shifted_im = circshift(img, [0, 40, 0]);  % [rows, cols, channels]
imshow(shifted_im);
imwrite(shifted_im, 'uofu_texture_labeless_off40.PNG');
%%%%%%%%%%%%%%%%%%%%%%%%%%%%

Other file formats might work too; I personally worked primarily with PNG files.

Anyway, you can use MATLAB, Photoshop, or whatever you want to line it up.  My guess-and-check method in MATLAB only took 4 tries to get it right, but it is probably not the cleanest way to line this up, and it presents a serious problem when trying to automate this process.
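If you would rather avoid MATLAB, the same wrap-around shift can be done in Python; numpy's roll is the equivalent of MATLAB's circshift (numpy, and Pillow for reading/writing the image, are my substitutions here, not part of the original workflow):

```python
import numpy as np

def shift_columns(img, pixels):
    """Shift an image array right by `pixels` columns, wrapping around."""
    return np.roll(img, pixels, axis=1)

# Usage with Pillow, if installed:
#   from PIL import Image
#   img = np.asarray(Image.open("uofu_texture_labeless.PNG"))
#   out = Image.fromarray(shift_columns(img, 40))
#   out.save("uofu_texture_labeless_off40.PNG")
```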


Exporting the Mesh


Once you have lined up your image, you should save this Blender file.  It will allow you to return and modify the mesh later, for example if you discover that north in Gazebo doesn't line up with north in the mesh.  This solves the problem we had previously when trying to import the DEM file directly into Gazebo.

Once you have all of this done, export the mesh to .stl and Collada (.dae) files.  These will be used within your .world file in Gazebo to import the mesh into your environment.