Thursday, July 2, 2015

Re: Method Question

Sorry I missed your post. Here are a few thoughts about your proposal:
  1. Since the drone will be navigating in a known indoor environment, you should take advantage of that knowledge and integrate the geometry (or map) of the building into your navigation algorithm (see the first sketch after this list).
  2. The drone's Inertial Measurement Unit (IMU) can definitely give you valuable real-time navigation information. Using it to control the drone's motion should be part of the learning curve of drone programming and is worth trying for our problem. We have used it indirectly (through third-party tools) before; it did not seem very reliable, and the drone drifted gradually. However, that shouldn't prevent you from testing the idea under your direct control (i.e., your own program). The second sketch below shows why the drift builds up.
  3. The ultrasound sensor on the drone is currently used to determine the height, not the distance to obstacles in front of the drone. It is possible to hack the drone and add an ultrasound sensor (plus supporting circuitry such as an Arduino board) pointing forward to detect objects (third sketch below).
  4. The camera on the drone is another powerful sensor that can provide a lot of information. Just like humans navigating an indoor environment, we rely heavily on vision. Color-mark guided navigation is just the first step of our project: placing color marks at strategic locations in the building will ease the task of visual detection. It is a simplified, intermediate step for testing the image processing and navigation. Once that has been done, we can explore more sophisticated algorithms for navigation without any marks, and even moving-obstacle avoidance. A color-detection sketch closes out the examples below.
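
On point 1, here is a minimal sketch of how a known floor plan could feed the navigation algorithm. It assumes the map can be reduced to a 2-D occupancy grid (1 = wall, 0 = free) and uses plain breadth-first search for routing; the toy grid and the start/goal cells are made-up examples, not our actual building.

# Path planning over a known indoor map (sketch; grid and coordinates are examples).
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over the occupancy grid; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            break
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    if goal not in came_from:
        return None                      # no route through the known map
    path, cell = [], goal
    while cell is not None:
        path.append(cell)
        cell = came_from[cell]
    return path[::-1]

# Toy 5x5 floor plan: a corridor with one wall segment.
floor = [[0, 0, 0, 0, 0],
         [0, 1, 1, 1, 0],
         [0, 0, 0, 1, 0],
         [1, 1, 0, 1, 0],
         [0, 0, 0, 0, 0]]
print(plan_path(floor, (0, 0), (4, 4)))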
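
On point 2, this sketch illustrates the drift problem we saw: double-integrating accelerometer readings turns even a tiny sensor bias into a growing position error. The read_accel() function and the bias/noise values are hypothetical stand-ins, not the drone's actual API.

# IMU dead reckoning along one axis (sketch; sensor model is simulated).
import random

def read_accel():
    # Pretend the drone is hovering: true acceleration is zero, but the
    # sensor reports a small constant bias plus noise (made-up values).
    return 0.02 + random.gauss(0.0, 0.05)

dt = 0.01                          # assume a 100 Hz update rate
velocity = position = 0.0
for step in range(6001):           # about one minute of "flight"
    a = read_accel()
    velocity += a * dt             # first integration: velocity estimate
    position += velocity * dt      # second integration: position estimate
    if step % 1000 == 0:
        print("t=%5.1f s  estimated position drift = %6.2f m" % (step * dt, position))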
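
On point 3, if you do add a forward-facing ultrasound sensor through an Arduino, one simple arrangement is to have the Arduino stream one distance reading (in cm) per line over USB serial and read it on the controlling computer with pyserial. This is only a sketch under that assumption; the port name, baud rate, line format, and stop threshold are all placeholders.

# Reading a front-facing ultrasound range from an Arduino over serial (sketch).
import serial   # pyserial

STOP_DISTANCE_CM = 50              # hypothetical safety threshold

with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
    while True:
        line = port.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue                       # timed out, no reading yet
        try:
            distance_cm = float(line)
        except ValueError:
            continue                       # garbled line, skip it
        if distance_cm < STOP_DISTANCE_CM:
            print("Obstacle ahead at %.0f cm - stop or turn" % distance_cm)
        else:
            print("Path clear: %.0f cm" % distance_cm)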
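
On point 4, a rough sketch of the color-mark detection step using OpenCV, assuming the marks are bright red patches and that a camera frame is available as a BGR image. The HSV bounds and minimum blob area are values to tune, not project constants.

# Detecting a red color mark in one camera frame (sketch; thresholds are tunable).
import cv2
import numpy as np

def find_color_mark(frame_bgr, min_area=500):
    """Return the (x, y) pixel center of the largest red blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two hue ranges.
    mask = cv2.inRange(hsv, np.array([0, 120, 70]),   np.array([10, 255, 255])) | \
           cv2.inRange(hsv, np.array([170, 120, 70]), np.array([180, 255, 255]))
    # [-2] picks the contour list across OpenCV versions.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < min_area:
        return None
    m = cv2.moments(largest)
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

frame = cv2.imread("frame.png")    # stand-in for one live camera frame
if frame is not None:
    print(find_color_mark(frame))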
