
[Paparazzi-devel] FPV app on AppStore for H.264 wifi video links


From: Gerard Toonstra
Subject: [Paparazzi-devel] FPV app on AppStore for H.264 wifi video links
Date: Mon, 18 Mar 2013 18:03:35 -0300

Hi there,

I'd like to announce something not immediately related to paparazzi *now*, 
although this may change in the future…


I've been working on an affordable, accessible and low-latency (HD) video
downlink (for consumers and hobbyists) for a year or so, using my own
resources. Software-wise things have been going smoothly, but hardware-wise
there are still issues to resolve. A lot of the research so far is documented
in a forum post on fpvlab.com:

http://fpvlab.com/forums/showthread.php?4869-The-complexities-of-streaming-HD-video-downlinks…


The ground station software that decodes and shows the transmitted video is
now stable enough that I have released it as an app on the AppStore. It's
called "FPV" (First Person View) as it mostly targets hobbyists, but newer
features that I have in the pipeline may make it very interesting for
attempts to integrate it with some paparazzi ground station modules or with
the autopilot itself. The FPV app decodes H.264 video over wifi and uses some
long-range wifi modules to get all the data across. (Actually, in theory,
paparazzi could already use a wifi link for telemetry using the w5100 module
that I created, but that still needs some work to remove the wait loops and
other cleanups.) There has also been some work done with Gumstix, I believe.
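
To give an idea of what the ground side of such a wifi telemetry link could
look like, here is a minimal sketch of a UDP receiver. This is not the w5100
module code and not the FPV app; the port number and the packet handling are
made up purely for illustration.

/* Minimal sketch of a ground-side UDP telemetry receiver.
 * NOT the w5100 module or FPV app code; the port and packet
 * handling below are hypothetical, for illustration only. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <sys/socket.h>

#define TELEMETRY_PORT 4242   /* hypothetical port */

int main(void)
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { perror("socket"); return 1; }

    struct sockaddr_in addr;
    memset(&addr, 0, sizeof(addr));
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(TELEMETRY_PORT);

    if (bind(sock, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("bind");
        return 1;
    }

    unsigned char buf[1500];   /* one wifi MTU worth of data */
    for (;;) {
        ssize_t n = recvfrom(sock, buf, sizeof(buf), 0, NULL, NULL);
        if (n < 0) { perror("recvfrom"); break; }
        /* A real implementation would parse the telemetry message framing
         * here instead of just printing the datagram size. */
        printf("received %zd byte telemetry datagram\n", n);
    }
    close(sock);
    return 0;
}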


The link to the App in the AppStore is here:

https://itunes.apple.com/br/app/fpv/id609127142?l=en&mt=12

A video showing the app with a video stream from a simulator is here:

http://www.youtube.com/watch?v=Y2LGJn8Whag

(The video stream is created by running the FlightGear simulator on Linux,
recording the desktop in real time and transferring this over my private
network. The telemetry is taken from FlightGear as well and used to paint the
OSD.) It's also interesting to note that the OSD at the top shows the
"technical" info that you'd typically find on an OSD designed by an engineer,
while the bottom view is designed more along the lines of Cognitive Work
Analysis.

The manual, the protocol description and more information on the hardware
that I used for field testing are here:

http://www.radialmind.org



The new features mentioned above are all about showing virtual cues in the
streamed image: think of a "virtual tunnel" to fly a specific trajectory
manually, or points of interest (POIs) that could be generated by touching a
display showing the same streamed image.
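
To make the virtual-cue idea a bit more concrete, below is a rough sketch of
the projection math involved, under simplifying assumptions (pinhole camera,
no lens distortion, camera pose known in a local NED frame). The types and
function names are mine, not taken from the FPV app.

/* Sketch: project a world point (e.g. a POI or a tunnel waypoint) into
 * image pixel coordinates with a simple pinhole camera model.
 * All names and parameters are illustrative, not from the FPV app. */
#include <stdbool.h>

struct vec3  { double x, y, z; };   /* local NED coordinates, metres */
struct pixel { double u, v; };      /* image coordinates, pixels */

/* Camera pose: position in NED and a 3x3 rotation matrix that takes a
 * world (NED) vector into the camera frame (x right, y down, z forward). */
struct camera {
    struct vec3 pos;
    double R[3][3];
    double fx, fy;    /* focal lengths in pixels */
    double cx, cy;    /* principal point */
};

/* Returns false if the point is behind the camera. */
bool project_point(const struct camera *cam, struct vec3 p, struct pixel *out)
{
    /* Vector from camera to point, in the world frame. */
    double d[3] = { p.x - cam->pos.x, p.y - cam->pos.y, p.z - cam->pos.z };

    /* Rotate into the camera frame. */
    double c[3];
    for (int i = 0; i < 3; i++)
        c[i] = cam->R[i][0]*d[0] + cam->R[i][1]*d[1] + cam->R[i][2]*d[2];

    if (c[2] <= 0.0)                 /* behind the image plane */
        return false;

    out->u = cam->fx * c[0] / c[2] + cam->cx;
    out->v = cam->fy * c[1] / c[2] + cam->cy;
    return true;
}
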
The idea behind these features is that spectators who are watching along with
an FPV flight might want to request flying over a specific area or point, and
they need a way to communicate that exact point to the pilot. In more serious
applications you'd probably have a vehicle with an autopilot, and then
effective communication between team members on the ground becomes necessary:
one person is scanning the camera image for POIs and at some point needs to
communicate what he sees to the flight operator. How do you quickly get a POI
seen in the camera image onto a 2D map, so that navigation changes can be
made effectively?
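
As a sketch of how that could work, here is the inverse of the projection
above: cast a ray from the camera through the touched or selected pixel and
intersect it with a flat ground plane. Again this is only an illustration
under a flat-earth assumption, not how the app actually implements it.

/* Sketch: map a selected pixel back to a ground point by intersecting
 * the camera ray with a flat ground plane (z = ground_z in NED, "down"
 * positive).  Same illustrative camera model as in the sketch above. */
#include <math.h>
#include <stdbool.h>

struct vec3 { double x, y, z; };    /* local NED coordinates, metres */

struct camera {
    struct vec3 pos;        /* camera position in NED */
    double R[3][3];         /* world (NED) -> camera rotation */
    double fx, fy, cx, cy;  /* pinhole intrinsics, pixels */
};

bool pixel_to_ground(const struct camera *cam, double u, double v,
                     double ground_z, struct vec3 *out)
{
    /* Ray direction through the pixel, expressed in the camera frame. */
    double rc[3] = { (u - cam->cx) / cam->fx,
                     (v - cam->cy) / cam->fy,
                     1.0 };

    /* Rotate back into the world frame: world = R^T * camera. */
    double rw[3];
    for (int i = 0; i < 3; i++)
        rw[i] = cam->R[0][i]*rc[0] + cam->R[1][i]*rc[1] + cam->R[2][i]*rc[2];

    /* Intersect the ray pos + t*rw with the plane z = ground_z. */
    if (fabs(rw[2]) < 1e-9)
        return false;                /* ray parallel to the ground */
    double t = (ground_z - cam->pos.z) / rw[2];
    if (t <= 0.0)
        return false;                /* ground plane behind the camera */

    out->x = cam->pos.x + t * rw[0];
    out->y = cam->pos.y + t * rw[1];
    out->z = ground_z;
    return true;
}

The resulting local coordinates could then be converted to lat/lon for the 2D
map; how well this works in practice depends mostly on the quality of the
attitude and altitude estimates.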

Rgds,

Gerard



