From: Ori Pessach
Subject: Re: [Paparazzi-devel] What's the state of ardrone2_vision?
Date: Sun, 14 Dec 2014 11:39:39 -0700
What you describe is not in the repo at the moment, but it has been done before: a simple module.xml with the following start and stop functions:

void gst_video_start(void)
{
  // Start GST plugin
  // Normal video 320
  system("/opt/arm/gst/bin/gst-launch v4l2src device=/dev/video1 ! videorate ! 'video/x-raw-yuv,framerate=15/1' ! videoscale ! video/x-raw-yuv, width=320, height=240 ! dspmp4venc ! rtpmp4vpay config-interval=2 ! udpsink host=192.168.1.255 port=5000 &");
}

void gst_video_stop(void)
{
  // Stop GST plugin
  system("kill -9 `pidof gst-launch-0.10` &");
}

and in the raw section we have:

<raw>
upload_extra:
	@echo "====================================VISION START/INSTALL======================================"
	ardrone2.py startvision
</raw>

-Christophe

On Sun, Dec 7, 2014 at 4:53 PM, Ori Pessach <address@hidden> wrote:

Also, I couldn't figure out from the documentation how to load any modules other than the ardrone2_vision mjpeg module... It sounds like there's a way to include the ardrone2_gstreamer module in a Paparazzi airframe file so that it sets everything up and starts a gstreamer pipeline when you press upload, but I couldn't figure out how to do it.

Any tips?

On Sun, Dec 7, 2014 at 7:54 AM, Christophe De Wagter <address@hidden> wrote:

ardrone2_vision contains mainly computer vision code and is work in progress. ardrone2_gstreamer is quite ready and has what you ask for: there are a few examples of gstreamer chains, and the upload button has all the needed code to upload images to the DSP and start it.

-Christophe

On Sat, Dec 6, 2014 at 10:58 PM, Ori Pessach <address@hidden> wrote:

Hi,

I'm trying to get the vision framework to work, but something is amiss. I can get the mjpeg streaming module to work (by including it in the airframe file), but the image quality is quite poor and there's a lot of latency when viewed with VLC. I'm guessing it's streamed over a TCP connection, which can do that.

What I would like to do is run the H.264 encoder on the DSP using gstreamer and, at the very least, save the video to a USB thumb drive on the vehicle. This should give the drone functionality similar to what you would get with the Parrot firmware.

As a first step, I'm trying to just run a pipeline to encode MPEG-4 on the drone using gst-launch, as described here: http://lists.gnu.org/archive/html/paparazzi-devel/2013-11/msg00184.html

This kinda works, after futzing with environment variables on the command line and stopping Paparazzi (since with the MJPEG module it grabs /dev/video1, and we can't have that now, can we?)

So, seeing that I can get gst-launch to work on the drone, I should be able to write a Paparazzi module that uses gstreamer to run the same pipeline, and use it to replace the mjpeg module. Right?

What I'm wondering about, though, is why the H.264 DSP-based encoder doesn't get installed along with the MPEG-4 encoder. Is this intentional? Is it an oversight? Can I just copy it over, launch it the same way that ardrone2.py launches the MPEG-4 encoder, and expect it to work? Am I missing something?

I would love it if someone who's familiar with the code and how it's supposed to work would comment on that.

Thanks,

Ori Pessach
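For anyone wiring this up from scratch: a module.xml along the lines below should tie the two functions above into the build. This is only a sketch based on the module format of that era; the module name, dir, file names, the dummy periodic function, and the exact makefile attributes are assumptions, so compare against the real ardrone2_gstreamer files under conf/modules before relying on it.

<!DOCTYPE module SYSTEM "module.dtd">
<module name="gst_video" dir="vision">
  <doc>
    <description>Start/stop a DSP MPEG-4 gst-launch pipeline on the ARDrone2 (sketch)</description>
  </doc>
  <header>
    <file name="gst_video.h"/>
  </header>
  <!-- Hook the start/stop functions so the pipeline can be toggled from the GCS.
       gst_video_periodic() is only a placeholder to satisfy the periodic element. -->
  <periodic fun="gst_video_periodic()" freq="1" start="gst_video_start()" stop="gst_video_stop()" autorun="TRUE"/>
  <makefile target="ap">
    <file name="gst_video.c"/>
    <raw>
upload_extra:
	@echo "====================================VISION START/INSTALL======================================"
	ardrone2.py startvision
    </raw>
  </makefile>
</module>

Loading it from the airframe file is then the usual module mechanism, i.e. a <load name="gst_video.xml"/> entry in the airframe's <modules> section, which is the part Ori was asking about.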
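On the recording-to-USB question, the thread never gives an answer, but in principle the same gst-launch 0.10 chain can be pointed at a muxer and a filesink instead of udpsink. Whether avimux is present in the trimmed-down /opt/arm/gst install, and whether the thumb drive really appears under /data/video/usb, are assumptions to verify on the drone first:

/opt/arm/gst/bin/gst-launch v4l2src device=/dev/video1 ! videorate ! \
  'video/x-raw-yuv,framerate=15/1' ! dspmp4venc ! avimux ! \
  filesink location=/data/video/usb/flight.avi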
Paparazzi-devel mailing list
address@hidden
https://lists.nongnu.org/mailman/listinfo/paparazzi-devel