paparazzi-devel

Re: [Paparazzi-devel] What's the state of ardrone2_vision?


From: Ori Pessach
Subject: Re: [Paparazzi-devel] What's the state of ardrone2_vision?
Date: Sun, 7 Dec 2014 11:51:37 -0700

And one last question: I'm guessing that dspmp4venc is implemented in libgstdsp.so - where's the source of that library? I found a similarly named library in the Maemo source tree. Are the libraries related, and does libgstdsp.so also have an H.264 encoder?

As it is, the gstreamer component for the AR.Drone2 is kind of broken. The only pipeline I could get to work was the MPEG-4 video -> UDP one, and it only works at 320x240. 640x480 didn't work at all, and 1280x720 rebooted the drone, presumably after crashing the encoder. It's conceivable that the MPEG-4 encoder can't handle that resolution, but I'm not sure what's really happening.
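For reference, the working pipeline looked roughly like this. I'm writing it from memory, so treat the element and property names (dspmp4venc, the caps string, the host/port) as assumptions rather than something copied off the drone:

```shell
# Sketch of the one pipeline that worked for me: camera -> DSP MPEG-4
# encoder -> RTP payload -> UDP to the ground station.
# Element names and caps are from memory and may not match your image exactly.
gst-launch v4l2src device=/dev/video1 \
  ! 'video/x-raw-yuv,width=320,height=240' \
  ! dspmp4venc \
  ! rtpmp4vpay \
  ! udpsink host=192.168.1.2 port=5000   # ground station IP/port (assumption)
```

Bumping the caps to 640x480 or 1280x720 is all it took to break it, which is why I suspect the encoder rather than the pipeline itself.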

Even at 320x240 there's a lot of latency and dropped frames. Something is not right with that code.

Ori

On Sun, Dec 7, 2014 at 7:54 AM, Christophe De Wagter <address@hidden> wrote:
ardrone2_vision contains mainly computer vision code and is work in progress.

ardrone2_gstreamer is quite ready and has what you ask for: there are a few examples of gstreamer chains and the upload button has all the needed code to upload images to the dsp and start it. 

-Christophe 

On Sat, Dec 6, 2014 at 10:58 PM, Ori Pessach <address@hidden> wrote:
Hi,

I'm trying to get the vision framework to work, but something is amiss. I can get the mjpeg streaming module to work (by including it in the airframe file), but the image quality is quite poor and there's a lot of latency when viewed with VLC. I'm guessing it's streamed over a TCP connection, which would explain the latency.

What I would like to do is try to run the H.264 encoder on the DSP using gstreamer, and save the video to a USB thumb drive on the vehicle, at the very least. This should give the drone similar functionality to what you would get with the Parrot firmware. 
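Something like the following is what I have in mind. To be clear, this is a guess: I don't know that a dsph264enc element exists on the drone's image, and the USB mount point is hypothetical:

```shell
# Hypothetical recording pipeline: camera -> DSP H.264 encoder -> AVI mux
# -> file on a USB thumb drive. Both the dsph264enc element name and the
# /data/video/usb mount point are assumptions, not verified on the drone.
gst-launch v4l2src device=/dev/video1 \
  ! 'video/x-raw-yuv,width=640,height=480' \
  ! dsph264enc \
  ! avimux \
  ! filesink location=/data/video/usb/flight.avi
```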

As a first step, I'm trying to just run a pipeline to encode MPEG-4 on the drone using gst-launch (as described here: http://lists.gnu.org/archive/html/paparazzi-devel/2013-11/msg00184.html).

This kinda works, after futzing with environment variables on the command line and stopping paparazzi (since with the MJPEG module loaded, paparazzi grabs /dev/video1, and we can't have that now, can we?).
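In case it helps anyone, this is roughly what I did over telnet before running the pipeline. The binary name and the variable values are from memory and may well differ on your image:

```shell
# Stop the autopilot so it releases /dev/video1 (ap.elf is what I believe
# the paparazzi binary is called on the AR.Drone2 -- check with ps first).
killall -9 ap.elf

# Point gstreamer at a writable registry and at wherever the DSP plugins
# were uploaded. Both paths here are assumptions, not the canonical ones.
export GST_REGISTRY=/tmp/gst-registry.bin
export GST_PLUGIN_PATH=/opt/arm/gst/lib/gstreamer-0.10

# Then run the pipeline with gst-launch as described in the linked message.
```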

So, seeing that I can get gst-launch to work on the drone, I should be able to write a paparazzi module using gstreamer to run the same pipeline and use it to replace the mjpeg module. Right?

What I'm wondering about, though, is why the DSP-based H.264 encoder doesn't get installed along with the MPEG-4 encoder. Is this intentional? Is it an oversight? Can I just copy it over, launch it the same way that ardrone2.py launches the MPEG-4 encoder, and expect it to work? Am I missing something?

I would love it if someone who's familiar with the code and how it's supposed to work would comment on that.

Thanks,

Ori Pessach




_______________________________________________
Paparazzi-devel mailing list
address@hidden
https://lists.nongnu.org/mailman/listinfo/paparazzi-devel



