I'm trying to get the vision framework to work, but something is amiss. I can get the mjpeg streaming module to work (by including it in the airframe file), but the image quality is quite poor and there's a lot of latency when viewed in VLC. I'm guessing it's streamed over a TCP connection, which would explain the latency.
What I would like to do is run the H.264 encoder on the DSP using gstreamer and, at the very least, save the video to a USB thumb drive on the vehicle. That would give the drone functionality similar to what you get with the Parrot firmware.
This kinda works, after futzing with environment variables on the command line and stopping paparazzi first (with the MJPEG module loaded it grabs /dev/video1, and we can't have that now, can we?).
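For reference, this is roughly the kind of pipeline I mean. Treat it as a sketch, not a known-good command: `dsph264enc` is the gst-dsp plugin's H.264 element (the same plugin family that provides the MPEG-4 encoder), and the USB mount point `/data/video/usb` is an assumption that depends on how the stick gets mounted.

```shell
# Run on the drone over telnet, with paparazzi stopped so /dev/video1 is free.
# dsph264enc and the mount point are assumptions -- adjust for your firmware.
gst-launch-0.10 v4l2src device=/dev/video1 \
  ! video/x-raw-yuv,width=1280,height=720 \
  ! dsph264enc \
  ! avimux \
  ! filesink location=/data/video/usb/flight.avi
```
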
So, seeing that I can get gst-launch to work on the drone, I should be able to write a paparazzi module that uses gstreamer to run the same pipeline, and use it to replace the mjpeg module. Right?
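In my head, such a module would be little more than gst_parse_launch() wrapped in the usual init/start/stop hooks. A minimal sketch, assuming the ARM toolchain has the gstreamer-0.10 headers; none of these function names exist in the tree yet, and the pipeline string (including the dsph264enc element and USB path) is carried over as an assumption from the gst-launch experiment:

```c
/* Hypothetical gstreamer video module sketch for paparazzi.
 * Function names follow the usual init/start/stop module convention;
 * they would be wired up via the module's XML file. */
#include <gst/gst.h>

static GstElement *pipeline = NULL;

void video_h264_init(void)
{
  gst_init(NULL, NULL);
  /* Same pipeline that worked from gst-launch; element and path
   * names are assumptions and vary per firmware. */
  pipeline = gst_parse_launch(
      "v4l2src device=/dev/video1 "
      "! video/x-raw-yuv,width=1280,height=720 "
      "! dsph264enc "
      "! avimux "
      "! filesink location=/data/video/usb/flight.avi",
      NULL);
}

void video_h264_start(void)
{
  if (pipeline) { gst_element_set_state(pipeline, GST_STATE_PLAYING); }
}

void video_h264_stop(void)
{
  if (pipeline) { gst_element_set_state(pipeline, GST_STATE_NULL); }
}
```
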
What I'm wondering about, though, is why the H.264 DSP-based encoder doesn't get installed along with the MPEG-4 encoder. Is this intentional, or an oversight? Can I just copy it over, launch it the same way ardrone2.py launches the MPEG-4 encoder, and expect it to work? Am I missing something?
I would love it if someone who's familiar with the code and how it's supposed to work would comment on that.