Comparison of 2.4 GHz WiFi, 5 GHz WiFi and 100 MBit LAN

Extract

Up to now I have been using an Edimax EW-7612UAn V2 USB WiFi dongle.  This has been very reliable and reasonably simple to install and configure.  The only problem is that it operates in the 2.4 GHz frequency band, which means it will interfere with the majority of modern radio control systems.  I have been using a 35 MHz RC system, which was fine as long as I was flying on my own.  However, I’d like to fly with others who will be using 2.4 GHz systems.

There are now a number of dual-band WiFi dongles that operate in the 5 GHz band as well as at 2.4 GHz.  This provides a solution, so long as they work with the Raspberry Pi.  A couple of candidates were obtained, configured and tested on my Model B Raspberry Pi.  For an extra comparison I also ran a test over the wired LAN interface.

The results were interesting in that there was no clear winner in the speed stakes; there was, however, a clear loser.  What is clear is that usable 5 GHz WiFi is possible, but for now it’s not plug and play.

Setup

For these tests I used the following network devices.

  1. Raspberry Pi Model B Ethernet (100 Mbit/s)
  2. Tenda W522U Dual Band (2.4 GHz & 5 GHz)
  3. Edimax EW-7612UAn V2 (2.4 GHz)
  4. Edimax AC600 Dual Band (2.4 GHz & 5 GHz)

I set up my Model B with the latest release of Raspbian (2015-01-31) and the current scripts.

5 GHz WiFi Prerequisites

To get 5 GHz WiFi working you need to install the Central Regulatory Domain Agent (crda).  Basically, the 5 GHz band is more tightly regulated worldwide and you need to specify your regulatory domain so that hostapd knows which channels are available.

To install the necessary packages use:

sudo apt-get install hostapd crda iw

Then edit /etc/default/crda and set the REGDOMAIN variable to match your region.  Examples are US, JP, EU and in my case UK.

[Screenshot: crda]
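If you prefer to do this from the command line, the sketch below sets the domain non-interactively and applies it straight away.  The use of GB is an assumption based on the ISO country-code list; use whichever code crda expects for your region.

# Set the regulatory domain in /etc/default/crda (GB is the usual ISO code for the UK)
sudo sed -i 's/^REGDOMAIN=.*/REGDOMAIN=GB/' /etc/default/crda

# Apply it to the wireless subsystem immediately
sudo iw reg set GB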

Devices

The amount of work needed to set up networking varied by device.  I’m not going to give a detailed description of configuring each device here; I’ll do that in separate posts.

1. LAN

This simply required plugging the Pi and the laptop into a switch and running ipconfig on the receiving device to get its IP address.  This was then used in remote.conf to direct the UDP stream.
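For reference, a hypothetical remote.conf might look something like the following.  The variable names are taken from the stream scripts used later (RX_IP, UDPPORT and so on); the values shown are examples only.

#!/bin/bash
# Example values only – RX_IP is the receiving laptop's address reported by ipconfig
export RX_IP=192.168.0.10
export UDPPORT=5001
export WIDTH=1280
export HEIGHT=720
export FRAMERATE=48
export BITRATE=6000000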

2. Tenda W522U

The W522U is nl80211 compatible.  This means it works with a standard install of hostapd.  No special configuration is needed.

3. Edimax EW-7612UAn V2 (2.4 GHz)

The EW-7612UAn V2 uses the RTL8191SU chipset.  The kernel supports this chipset for client networking, but not for AP mode.  However, Realtek provides a custom version of hostapd which does work once downloaded, compiled and installed.

4. Edimax AC600 Dual Band (2.4 GHz & 5 GHz)

The AC600 is not supported by the latest kernel.  All is not lost; it just takes a little more work.  You need to download the kernel source code and the code for a Realtek 8812U driver, then install the latest version of GCC from the Jessie repository so that you can build the kernel module.  Finally you need the same hostapd build as for the Edimax EW-7612UAn V2.

Hostapd Configuration

There are four combinations of hostapd.conf, depending on the hostapd build and the band.  They are set out below.

Standard hostapd (nl80211 driver) – 2.4 GHz

interface=wlan0
driver=nl80211
ssid=Pi_AP
hw_mode=g
channel=13
macaddr_acl=0
auth_algs=1
ignore_broadcast_ssid=0
wpa=2
wpa_passphrase=Raspberry
wpa_key_mgmt=WPA-PSK
wpa_pairwise=TKIP
rsn_pairwise=CCMP

Standard hostapd (nl80211 driver) – 5 GHz

interface=wlan0
driver=nl80211
ssid=Pi_AP
hw_mode=a
channel=44
ht_capab=[HT40+]
ieee80211n=1
macaddr_acl=0
auth_algs=1
ignore_broadcast_ssid=0
wpa=2
wpa_passphrase=Raspberry
wpa_key_mgmt=WPA-PSK
wpa_pairwise=TKIP
rsn_pairwise=CCMP

Realtek hostapd (rtl871xdrv driver) – 2.4 GHz

interface=wlan0
driver=rtl871xdrv
ctrl_interface=/var/run/hostapd
ssid=Pi_AP
channel=13
beacon_int=100
hw_mode=g
ieee80211n=1
wme_enabled=1
wpa=2
wpa_passphrase=Raspberry
wpa_key_mgmt=WPA-PSK
wpa_pairwise=CCMP
max_num_sta=8
wpa_group_rekey=86400

Realtek hostapd (rtl871xdrv driver) – 5 GHz

interface=wlan0
driver=rtl871xdrv
ctrl_interface=/var/run/hostapd
ssid=Pi_AP
channel=44
beacon_int=100
hw_mode=a
ieee80211n=1
wme_enabled=1
ht_capab=[SHORT-GI-20][SHORT-GI-40][HT40+]
wpa=2
wpa_passphrase=Raspberry
wpa_key_mgmt=WPA-PSK
wpa_pairwise=CCMP
max_num_sta=8
wpa_group_rekey=86400

Procedure

These tests were a straight comparison between the four devices using the same settings and the same testing methodology as previously.  The resolution was 1280 x 720 pixels with a 6 Mbps bit rate.

Results

The table below shows the resulting latency by device.

Device                              Latency
Edimax EW-7612UAn V2 – 2.4 GHz      132 ms
Tenda W522U – 2.4 GHz & 5 GHz       > 10 s
Edimax AC600 – 2.4 GHz              122 ms
Edimax AC600 – 5 GHz                126 ms
LAN                                 124 ms

Here they are as a graph.

[Graph: WiFiAdapters – latency by device]

Analysis

The big surprise is how bad the Tenda W522U adapter was.  The latency was not just two or three times higher, it was two orders of magnitude higher.  There are a number of threads on various forums where other people have found similar issues, although not to this extent; but then they are probably not swamping the feed with a continuous video stream.  Whether this is just a compatibility issue with the Pi remains to be seen.

The Edimax AC600 was 10 ms faster than the previously used Edimax EW-7612UAn V2 at 2.4 GHz.  At 5 GHz it was still faster, but only by 6 ms.  Interestingly, the LAN connection was not any faster than the AC600.  On the Pi the wired LAN runs at 100 Mbit/s and hangs off the same single USB 2.0 bus as everything else, so although the AC600 is theoretically capable of 433 Mbit/s it is never going to get anywhere near that.

Conclusion

What we can take away from this test is that there is a limit to how much the latency can be reduced by changing the WiFi adapter.  It is a shame that getting a 5 GHz WiFi link requires so much compiling and configuration; hopefully this will improve with time.  For now we have another small reduction in the video latency to add to the current optimisations.

Addendum

Just as these tests were being done, the new Raspberry Pi 2 was released, promising a 1.5× to 6× speed increase.  As I had used the latest release of Raspbian, I swapped the microSD card from the Model B to the Model 2B.  What I found was that the Edimax AC600 was no longer detected and I couldn’t find a way to compile the module so that it would be.  So that’s my next task.  I may also run a quick comparison of the B versus the 2B with the Edimax EW-7612UAn V2 to see how much improvement the new processor has made.


Raspivid v Gst-rpicamsrc (Updated)

Introduction

User bocorps pointed me in the direction of gst-rpicamsrc.  This is “… a GStreamer wrapper around the raspivid/raspistill functionality of the RaspberryPi, providing a GStreamer source element capturing from the Rpi camera.”

What this means is that instead of piping the output of raspivid into gstreamer, gstreamer has a source element to read the camera directly.  This is similar to using the video4linux (v4l) source element, but negates the need for a v4l driver.

My hope was that by integrating the camera functionality into a gstreamer source element the latency would be reduced.  Unfortunately, I actually saw an 18% increase in latency.

Installation

Before I could use the gst-rpicamsrc element, I needed to download the source and build it.  As I was working with a minimal install of Raspbian Jessie, I needed to install the git package before I could do anything else.

sudo apt-get install git

With git installed I could download the latest sources for gst-rpicamsrc.

git clone https://github.com/thaytan/gst-rpicamsrc.git

With that done a look in the REQUIREMENTS file indicated what other packages were needed in order to accomplish the build.

sudo apt-get install autoconf automake libtool libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev libraspberrypi-dev

Finally, I was able to complete the build and install.

./autogen.sh --prefix=/usr --libdir=/usr/lib/arm-linux-gnueabihf/
make
sudo make install

The command ‘gst-inspect-1.0 rpicamsrc’ produces a list of the available parameters.

Factory Details:
  Rank                     none (0)
  Long-name                Raspberry Pi Camera Source
  Klass                    Source/Video
  Description              Raspberry Pi camera module source
  Author                   Jan Schmidt <jan@centricular.com>

Plugin Details:
  Name                     rpicamsrc
  Description              Raspberry Pi Camera Source
  Filename                 /usr/lib/arm-linux-gnueabihf/gstreamer-1.0/libgstrpicamsrc.so
  Version                  1.0.0
  License                  LGPL
  Source module            gstrpicamsrc
  Binary package           GStreamer
  Origin URL               http://gstreamer.net/

GObject
 +----GInitiallyUnowned
       +----GstObject
             +----GstElement
                   +----GstBaseSrc
                         +----GstPushSrc
                               +----GstRpiCamSrc

Pad Templates:
  SRC template: 'src'
    Availability: Always
    Capabilities:
      video/x-h264
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
          stream-format: byte-stream
              alignment: au
                profile: { baseline, main, high }


Element Flags:
  no flags set

Element Implementation:
  Has change_state() function: gst_base_src_change_state

Element has no clocking capabilities.
Element has no URI handling capabilities.

Pads:
  SRC: 'src'
    Implementation:
      Has getrangefunc(): gst_base_src_getrange
      Has custom eventfunc(): gst_base_src_event
      Has custom queryfunc(): gst_base_src_query
      Has custom iterintlinkfunc(): gst_pad_iterate_internal_links_default
    Pad Template: 'src'

Element Properties:
  name                : The name of the object
                        flags: readable, writable
                        String. Default: "rpicamsrc0"
  parent              : The parent of the object
                        flags: readable, writable
                        Object of type "GstObject"
  blocksize           : Size in bytes to read per buffer (-1 = default)
                        flags: readable, writable
                        Unsigned Integer. Range: 0 - 4294967295 Default: 4096 
  num-buffers         : Number of buffers to output before sending EOS (-1 = unlimited)
                        flags: readable, writable
                        Integer. Range: -1 - 2147483647 Default: -1 
  typefind            : Run typefind before negotiating
                        flags: readable, writable
                        Boolean. Default: false
  do-timestamp        : Apply current stream time to buffers
                        flags: readable, writable
                        Boolean. Default: true
  bitrate             : Bitrate for encoding
                        flags: readable, writable
                        Integer. Range: 1 - 25000000 Default: 17000000 
  preview             : Display preview window overlay
                        flags: readable, writable
                        Boolean. Default: true
  preview-encoded     : Display encoder output in the preview
                        flags: readable, writable
                        Boolean. Default: true
  preview-opacity     : Opacity to use for the preview window
                        flags: readable, writable
                        Integer. Range: 0 - 255 Default: 255 
  fullscreen          : Display preview window full screen
                        flags: readable, writable
                        Boolean. Default: true
  sharpness           : Image capture sharpness
                        flags: readable, writable
                        Integer. Range: -100 - 100 Default: 0 
  contrast            : Image capture contrast
                        flags: readable, writable
                        Integer. Range: -100 - 100 Default: 0 
  brightness          : Image capture brightness
                        flags: readable, writable
                        Integer. Range: 0 - 100 Default: 50 
  saturation          : Image capture saturation
                        flags: readable, writable
                        Integer. Range: -100 - 100 Default: 0 
  iso                 : ISO value to use (0 = Auto)
                        flags: readable, writable
                        Integer. Range: 0 - 3200 Default: 0 
  video-stabilisation : Enable or disable video stabilisation
                        flags: readable, writable
                        Boolean. Default: false
  exposure-compensation: Exposure Value compensation
                        flags: readable, writable
                        Integer. Range: -10 - 10 Default: 0 
  exposure-mode       : Camera exposure mode to use
                        flags: readable, writable
                        Enum "GstRpiCamSrcExposureMode" Default: 1, "auto"
                           (0): off              - GST_RPI_CAM_SRC_EXPOSURE_MODE_OFF
                           (1): auto             - GST_RPI_CAM_SRC_EXPOSURE_MODE_AUTO
                           (2): night            - GST_RPI_CAM_SRC_EXPOSURE_MODE_NIGHT
                           (3): nightpreview     - GST_RPI_CAM_SRC_EXPOSURE_MODE_NIGHTPREVIEW
                           (4): backlight        - GST_RPI_CAM_SRC_EXPOSURE_MODE_BACKLIGHT
                           (5): spotlight        - GST_RPI_CAM_SRC_EXPOSURE_MODE_SPOTLIGHT
                           (6): sports           - GST_RPI_CAM_SRC_EXPOSURE_MODE_SPORTS
                           (7): snow             - GST_RPI_CAM_SRC_EXPOSURE_MODE_SNOW
                           (8): beach            - GST_RPI_CAM_SRC_EXPOSURE_MODE_BEACH
                           (9): verylong         - GST_RPI_CAM_SRC_EXPOSURE_MODE_VERYLONG
                           (10): fixedfps         - GST_RPI_CAM_SRC_EXPOSURE_MODE_FIXEDFPS
                           (11): antishake        - GST_RPI_CAM_SRC_EXPOSURE_MODE_ANTISHAKE
                           (12): fireworks        - GST_RPI_CAM_SRC_EXPOSURE_MODE_FIREWORKS
  metering-mode       : Camera exposure metering mode to use
                        flags: readable, writable
                        Enum "GstRpiCamSrcExposureMeteringMode" Default: 0, "average"
                           (0): average          - GST_RPI_CAM_SRC_EXPOSURE_METERING_MODE_AVERAGE
                           (1): spot             - GST_RPI_CAM_SRC_EXPOSURE_METERING_MODE_SPOT
                           (2): backlist         - GST_RPI_CAM_SRC_EXPOSURE_METERING_MODE_BACKLIST
                           (3): matrix           - GST_RPI_CAM_SRC_EXPOSURE_METERING_MODE_MATRIX
  awb-mode            : White Balance mode
                        flags: readable, writable
                        Enum "GstRpiCamSrcAWBMode" Default: 1, "auto"
                           (0): off              - GST_RPI_CAM_SRC_AWB_MODE_OFF
                           (1): auto             - GST_RPI_CAM_SRC_AWB_MODE_AUTO
                           (2): sunlight         - GST_RPI_CAM_SRC_AWB_MODE_SUNLIGHT
                           (3): cloudy           - GST_RPI_CAM_SRC_AWB_MODE_CLOUDY
                           (4): shade            - GST_RPI_CAM_SRC_AWB_MODE_SHADE
                           (5): tungsten         - GST_RPI_CAM_SRC_AWB_MODE_TUNGSTEN
                           (6): fluorescent      - GST_RPI_CAM_SRC_AWB_MODE_FLUORESCENT
                           (7): incandescent     - GST_RPI_CAM_SRC_AWB_MODE_INCANDESCENT
                           (8): flash            - GST_RPI_CAM_SRC_AWB_MODE_FLASH
                           (9): horizon          - GST_RPI_CAM_SRC_AWB_MODE_HORIZON
  image-effect        : Visual FX to apply to the image
                        flags: readable, writable
                        Enum "GstRpiCamSrcImageEffect" Default: 0, "none"
                           (0): none             - GST_RPI_CAM_SRC_IMAGEFX_NONE
                           (1): negative         - GST_RPI_CAM_SRC_IMAGEFX_NEGATIVE
                           (2): solarize         - GST_RPI_CAM_SRC_IMAGEFX_SOLARIZE
                           (3): posterize        - GST_RPI_CAM_SRC_IMAGEFX_POSTERIZE
                           (4): whiteboard       - GST_RPI_CAM_SRC_IMAGEFX_WHITEBOARD
                           (5): blackboard       - GST_RPI_CAM_SRC_IMAGEFX_BLACKBOARD
                           (6): sketch           - GST_RPI_CAM_SRC_IMAGEFX_SKETCH
                           (7): denoise          - GST_RPI_CAM_SRC_IMAGEFX_DENOISE
                           (8): emboss           - GST_RPI_CAM_SRC_IMAGEFX_EMBOSS
                           (9): oilpaint         - GST_RPI_CAM_SRC_IMAGEFX_OILPAINT
                           (10): hatch            - GST_RPI_CAM_SRC_IMAGEFX_HATCH
                           (11): gpen             - GST_RPI_CAM_SRC_IMAGEFX_GPEN
                           (12): pastel           - GST_RPI_CAM_SRC_IMAGEFX_PASTEL
                           (13): watercolour      - GST_RPI_CAM_SRC_IMAGEFX_WATERCOLOUR
                           (14): film             - GST_RPI_CAM_SRC_IMAGEFX_FILM
                           (15): blur             - GST_RPI_CAM_SRC_IMAGEFX_BLUR
                           (16): saturation       - GST_RPI_CAM_SRC_IMAGEFX_SATURATION
                           (17): colourswap       - GST_RPI_CAM_SRC_IMAGEFX_COLOURSWAP
                           (18): washedout        - GST_RPI_CAM_SRC_IMAGEFX_WASHEDOUT
                           (19): posterise        - GST_RPI_CAM_SRC_IMAGEFX_POSTERISE
                           (20): colourpoint      - GST_RPI_CAM_SRC_IMAGEFX_COLOURPOINT
                           (21): colourbalance    - GST_RPI_CAM_SRC_IMAGEFX_COLOURBALANCE
                           (22): cartoon          - GST_RPI_CAM_SRC_IMAGEFX_CARTOON
  rotation            : Rotate captured image (0, 90, 180, 270 degrees)
                        flags: readable, writable
                        Integer. Range: 0 - 270 Default: 0 
  hflip               : Flip capture horizontally
                        flags: readable, writable
                        Boolean. Default: false
  vflip               : Flip capture vertically
                        flags: readable, writable
                        Boolean. Default: false
  roi-x               : Normalised region-of-interest X coord
                        flags: readable, writable
                        Float. Range:               0 -               1 Default:               0 
  roi-y               : Normalised region-of-interest Y coord
                        flags: readable, writable
                        Float. Range:               0 -               1 Default:               0 
  roi-w               : Normalised region-of-interest W coord
                        flags: readable, writable
                        Float. Range:               0 -               1 Default:               1 
  roi-h               : Normalised region-of-interest H coord
                        flags: readable, writable
                        Float. Range:               0 -               1 Default:               1

Usage

Because of the way GStreamer works, the parameters for the feed needed to be split and re-arranged in the GStreamer pipeline.  Previously, all the parameters were specified as part of the raspivid command.

/opt/vc/bin/raspivid -t $DURATION -w $WIDTH -h $HEIGHT -fps $FRAMERATE -b $BITRATE -n -pf high -o - | gst-launch-1.0 -v fdsrc ! 

GStreamer parameters like width, height and frame rate are configured through capabilities (caps) negotiation with the next element. Other parameters like the bit rate and preview screen are controlled as part of the source element.

gst-launch-1.0 rpicamsrc bitrate=$BITRATE preview=0 ! video/x-h264,width=$WIDTH,height=$HEIGHT,framerate=$FRAMERATE/1 !

The new stream script is:

#!/bin/bash

source remote.conf

if [ "$1" != "" ]
then
  export FRAMERATE=$1
fi

NOW=`date +%Y%m%d%H%M%S`
FILENAME=$NOW-Tx.h264

gst-launch-1.0 rpicamsrc bitrate=$BITRATE preview=0 ! video/x-h264,width=$WIDTH,height=$HEIGHT,framerate=$FRAMERATE/1,profile=high ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=$RX_IP port=$UDPPORT

Tests

This test was a straight comparison between the old and new scripts using the same settings and the same testing methodology as previously.  The resolution was 1280 x 720 pixels with a 6Mbps bitrate.

Update: Since the original article was published, Jan Schmidt (the author of gst-rpicamsrc) spotted that I had misspelled “framerate” as “framereate” in the gst-rpicamsrc script.  He also suggested I should try using the baseline profile and a queue element to decouple the video capture from the UDP transmission.  With this in mind I have re-run the tests.
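For reference, this is a sketch of how the transmit pipeline looks with the framerate caps corrected, the baseline profile selected and a queue element added.  The exact placement of the queue (between the parser and the payloader) is my assumption rather than a quote from the re-run script.

gst-launch-1.0 rpicamsrc bitrate=$BITRATE preview=0 ! \
  video/x-h264,width=$WIDTH,height=$HEIGHT,framerate=$FRAMERATE/1,profile=baseline ! \
  h264parse ! queue ! rtph264pay config-interval=1 pt=96 ! \
  udpsink host=$RX_IP port=$UDPPORT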

Results

Almost immediately I had the feeling that gst-rpicamsrc was slower.  Analysis of the video showed I was correct: the latency using gst-rpicamsrc was 18% higher than using raspivid.

Update:

Running the original erroneous script with debugging on showed that the capture was running at 30 fps instead of the intended 48 fps.  Here are the new results, averaged from 10 cycles.

Latency in ms, averaged over 10 cycles:

Profile     gst-rpicamsrc @ 48 fps        raspivid @ 48 fps
            No queue     With queue       No queue     With queue
baseline    184.2        175.4            153.9        156.9
high        186.2        185.0            154.1        159.7

gst-rpicamsrc @ 30 fps, high profile, no queue = 198.2 ms

Analysis

The first thing to note is that the raspivid latency (no queue, high profile) has risen from the 126 ms found in the last tests to 154 ms.  The only difference was that I cloned the SanDisk microSDHC card onto a Transcend 8 GB card, so I’ll set up some more tests to compare the cards.  As these tests were all run from the same card and the same boot, they are still valid for comparison with each other.

It is immediately obvious that the gst-rpicamsrc latency is about 20% higher than that of the raspivid script, so the conclusion from the first publication of this article still stands.

What can be added is that using the baseline profile does reduce the latency a little: 1 to 3 ms in most cases.

Adding a queue element does provide a benefit for the gst-rpicamsrc script, especially with the baseline profile, where a 9 ms reduction in latency was observed.  For the raspivid script, adding a queue element actually increased the latency by 3 to 4 ms.  I suspect this is because the video stream is already decoupled from GStreamer by being piped in from an external process.

Conclusion

Using gst-rpicamsrc provides no latency benefit over raspivid.  That is not to say gst-rpicamsrc provides no other benefits.  For any use other than FPV, I would definitely use gst-rpicamsrc rather than having to pipe the video in through stdin.  It provides plenty of options for setting up the video stream, as the ‘gst-inspect-1.0 rpicamsrc’ output above showed.

The problem here is that I am targeting this development at FPV use, where low latency is the driving factor.  At the moment my lowest latency for an adequate quality HD stream is 125 ms and I really need to get this under 100 ms to compete with current analog standard definition systems.  Whether it is possible to shave off another 25 ms remains to be seen.

Update: Following the additional tests I would add that it is better to use the baseline profile over the high profile.

Effect of Frame & Bit rates on Latency

Extract

Back in March, an update to the Raspberry Pi camera software introduced some new video modes with higher frame rates for resolutions below full 1080p HD.  For 720p HD, frame rates up to 49 fps are now available, and for VGA (640×480) there are new modes for 60 and 90 fps.  There were reports that, at these higher frame rates, the latency was reduced.  This didn’t come as a surprise, as it has been known for a while that the GoPro video output has lower latency at 48 fps than at 30 fps.  The reasons for this are not totally clear, but it is thought to be because the differences between successive frames are smaller at higher frame rates, allowing more efficient compression.

In order to investigate this I ran some back-to-back tests at various frame rates.  I was expecting that, with twice as many frames being squeezed into the same bit rate, the video quality would suffer, so the tests were repeated at three different bit rates.  The result of this testing was a 25% reduction in latency, with a new minimum of 121 milliseconds at 720p and 48 fps.  As expected, the video quality at the old bit rate of 2.5 Mbps suffered for fast moving backgrounds, but even when the bit rate was increased to 6.0 Mbps the latency was still only 126 ms; a 23% reduction.

With VGA resolution at 60 fps and 2.5 Mbps the latency dropped to 87 ms, and while this is useful, in terms of hardware size, convenience and even latency it is still bettered by dedicated FPV hardware.  If you include the cost of the laptop, it loses out on price too.  Where the Pi has the advantage is at the 1280 x 720 HD resolution, where there is currently no affordable competition.

Setup

The test environment was set up the same as previously, with the camera pointing at the laptop screen.  The camera was positioned so that at least ten iterations of the image were visible at once. I used a single LED torch as the measurement indicator.  The tests were run with no overclocking.

I used a UDP stream with some improved scripts to simplify running all the variations. The scripts can be found on the Current Scripts Page.

A GoPro 3 Black was used to record the screen at 120 fps.

Tests were run at frame rates of 25, 30, 36, 42, 45, 48 and 49 fps and at bit rates of 2.5, 4.5 and 6.0 Mbps, as scripted in the sketch below.  Up to five LED on/off cycles were recorded on the GoPro for later analysis.  With ten iterations for each of the five on/off cycles, the calculated latency represents the average of 100 tests.
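A simple wrapper along these lines is one way to step through all of the combinations.  This is a sketch only; stream.sh is a hypothetical transmit script assumed to take the frame rate and bit rate as arguments.

#!/bin/bash
# Sketch: run a transmit script for every frame rate / bit rate combination
for BITRATE in 2500000 4500000 6000000; do
  for FPS in 25 30 36 42 45 48 49; do
    ./stream.sh $FPS $BITRATE
  done
done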

Results 1 – Latency

Across all the bit rates, the results show a definite reduction in latency with increasing frame rate, right up to 48 frames per second.  At 49 fps the latency increased slightly.  The lowest latency achieved was 121 ms at 48 fps for the 2.5 Mbps stream, a reduction of 25% over the 25 fps result.  The 6.0 Mbps stream still managed a reduced latency of 126 ms at 48 fps and the 4.5 Mbps stream achieved 125 ms.

The VGA test showed a much lower latency at 87ms, even though it was using the same bit rate as the smallest 720p HD stream.

The graph shows some odd behavior between the 4.5 and 6.0 Mbps curves.  Below about 40 fps the 4.5 Mbps stream has higher latency than the 6.0 Mbps stream.  Beyond 40 fps the curves switch over to what you would expect.

[Graph: fps – average latency against frame rate for each bit rate]

In addition to the average latency, the graph below shows the minimum and maximum for each bit rate.  What is apparent is that there is a lot of overlap between the results and that the latency can vary by +/- 20 ms, particularly at the lower bit rates.

[Graph: fps2 – minimum and maximum latency for each bit rate]

Results 2 – Video Quality

The frame grabs below show a clear difference between the three bit rates at 48 fps. The 2.5 Mbps stream shows extreme pixelation and a general blurring of the image.

[Image: Quality25]

Frame grab from 2.5 Mbps stream @ 48 fps

The 4.5 Mbps stream shows a reduction in the pixelation.  This is apparent in the sky and on the tiled roof.

[Image: Quality45]

Frame grab from 4.5 Mbps Stream @ 48 fps

The 6.0 Mbps stream is the clear winner, with a minimum of pixelation and the sharpest image.

[Image: Quality60]

Frame grab from 6.0 Mbps Stream @ 48 fps

Conclusion

The total latency is made up of capture, compression, transmission and display components.  The capture and  display components should be the same for the three bit rates as the resolution is the same.  The transmission component should increase in relation to the higher bit rates.  That just leaves the compression component.  The compressor has to work hard to squeeze the stream into a smaller bit rate whilst maintaining the best video quality possible.  For this reason decreasing the bit rate can increase the latency, although there is probably a point where the extreme pixelation starts reducing the latency again.

These opposing effects between compression and transmission are likely what caused the odd behavior between the 4.5 Mbps and 6.0 Mbps streams.  At the lower frame rates, the compression delay for the 4.5 Mbps stream was more significant.  At the higher frame rates, the transmission delay of the 6.0 Mbps stream became more significant.

While the VGA test used the same 2.5 Mbps bit rate as the smallest HD test, it has only a third of the pixels, so the capture, compression and display components resulted in a much lower latency of 87 ms.

While the lowest latency was achieved for the 2.5 Mbps stream, the extreme pixelation of the image renders it pretty much unusable.  The 4.5 Mbps stream could be used but it only has a 1 ms advantage over the superior image quality of the 6.0 Mbps stream.  For this reason I plan to adopt the 48 fps, 6.0 Mbps, 126 ms stream as my new baseline when flying.

I will probably also switch to a 24 fps pipeline for YouTube videos, to get the best image quality from the 48 fps video by dropping every other frame.  I have seen this done where people have used the 48 fps recording mode on GoPro cameras.

B or B+

Extract

The Raspberry Pi Model B+ has recently been released with improved USB hardware.  This test aimed to determine if this new hardware would reduce the latency compared to the Model B.  And the answer I’m afraid is No.

[Graph: FPiV Model B v B+]

Setup

The new USB hardware requires different firmware from before.  As I was going to have to rebuild the operating system anyway, I opted to try a minimal installation using raspbian-ua-netinst.  There was a second reason for doing this, which was to install the “Jessie” build of Raspbian rather than “Wheezy”, because “Jessie” comes with the latest build of GStreamer as standard.  This is currently at version 1.4.1; in “Wheezy” you only have access to version 1.0.
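A quick way to confirm which GStreamer version a given install provides is to ask the launch tool directly:

gst-launch-1.0 --version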

Two installs were completed using identical procedures. One on my old model B and one on a new B+.

The test environment was set up the same as previously, with the camera pointing at the laptop screen.  The camera was positioned so that at least ten iterations of the image were visible at once.  I used the LED on my phone with a flashlight app as the measurement indicator.  The tests were run with no overclocking.

I used a UDP stream using the same scripts as for the previous Transport Stream and Overclocking Test.

A GoPro 3 Black was used to record the screen at 120 fps.

Three identical tests were run: one for each Raspberry Pi using the operating system that had been built on it, and then a third test using the B+ micro SD card in the Model B with an adapter.  A number of LED on/off cycles were recorded and then two cycles from each run were picked at random for analysis.

Results

The GoPro video was analyzed using VideoReDo H264 to establish the number of frames between the LED switch on (or off) and the tenth iteration switch on/off.  This value was then divided by 1.2 to get the latency in milliseconds.

  1. Model B, natively built OS: 156 ms, 150 ms, 138 ms, 147 ms = an average of 147 ms +/- 10 ms
  2. Model B+, natively built OS: 153 ms, 154 ms, 157 ms, 154 ms = an average of 154 ms +/- 3 ms
  3. Model B with the B+ built OS: 148 ms, 142 ms, 158 ms, 148 ms = an average of 149 ms +/- 8 ms

Conclusion

For now the Model B+ appears to give a slightly higher latency than the Model B.  This is a shame as the B+ has a much cleaner layout.  Hopefully, the latency will improve with time as the USB drivers are optimised.  The “Jessie” build of Raspbian has not been officially released yet.

The results for the Model B were slightly better than the previous Transport Stream and Overclocking Test by about 10 milliseconds, so it appears the minimal raspbian-ua-netinst build was worthwhile.  See this post for details on how to configure raspbian-ua-netinst for “Jessie”.

Transport Stream and Overclocking Comparison

Extract

Various issues have kept me grounded for a couple of months, but I have managed to get some bench testing done.  Two things I wanted to test were a comparison of TCP versus UDP for the transport stream and the effect of overclocking the Pi.  In both cases the results were disappointing with only minimal reductions in overall latency.

[Graph: TransportAndOverclockLatency]

Setup

The test environment was set up the same as previously, with the camera pointing at the laptop screen.  The camera was positioned so that at least ten iterations of the image were visible at once.  I used the LED on my phone with a flashlight app as the measurement indicator.  The tests were run with no overclocking and then with the maximum (Turbo) setting available in raspi-config.

The scripts for transmitting and receiving the video stream are shown below.

A GoPro 3 Black was used to record the screen at 120 fps.

TCP – Pi Transmitter Script

#!/bin/sh
cat $0

# Zero duration is continuous
DURATION=0
WIDTH=1280
HEIGHT=720
FRAMERATE=30
BITRATE=2500000
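# Work out this Pi's own wlan0 IPv4 address; tcpserversink listens on it below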
IP=$(ip -o addr show wlan0 | sed -n 's/.*inet \([0-9.]*\)\/.. .*/\1/p')
PORT=5000

/opt/vc/bin/raspivid -t $DURATION -w $WIDTH -h $HEIGHT -fps $FRAMERATE -b $BITRATE -n -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=$IP port=$PORT

TCP – Laptop Receiver and Recording Script

#!/bin/bash
source ./settings.conf

NOW=`date +%Y%m%d%H%M%S`
FILENAME=$NOW-Rx.ts

gst-launch-1.0 -v tcpclientsrc host=$IP port=$TCPPORT ! \
  gdpdepay ! rtph264depay ! \
  h264parse config-interval=96 ! \
  tee name=t ! queue ! \
  video/x-h264, framerate=25/1 ! avdec_h264 ! videoconvert ! autovideosink sync=false t. ! \
  queue ! mpegtsmux ! filesink location=$FILENAME

UDP – Pi Transmitter Script

#!/bin/sh
cat $0

# Zero duration is continuous
DURATION=0
WIDTH=1280
HEIGHT=720
FRAMERATE=48
BITRATE=2500000
IP=$(ip -o addr show wlan0 | sed -n 's/.*inet \([0-9.]*\)\/.. .*/\1/p')

#Edit the address below to match the receiving computer
TARGETIP=192.168.42.26

UDPPORT=5001
NOW=`date +%Y%m%d%H%M%S`
FILENAME=$NOW-Tx.h264

/opt/vc/bin/raspivid -t $DURATION -w $WIDTH -h $HEIGHT -fps $FRAMERATE -b $BITRATE -n -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=$TARGETIP port=$UDPPORT

UDP – Laptop Receiver and Recording Script

#!/bin/bash
source ./settings.conf

NOW=`date +%Y%m%d%H%M%S`
FILENAME=$NOW-Rx.ts

gst-launch-1.0 -v udpsrc port=$UDPPORT \
  caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! \
  rtph264depay ! h264parse config-interval=96 ! \
  tee name=t ! queue ! \
  video/x-h264, framerate=48/1 ! avdec_h264 ! videoconvert ! autovideosink sync=false t. ! \
  queue ! mpegtsmux ! filesink location=$FILENAME

settings.conf

#!/bin/bash
export IP=192.168.42.1
export TCPPORT=5000
export UDPPORT=5001
export FRAMERATE=25

Results

The GoPro video was analyzed in Sony Movie Studio 11 to establish the number of frames between the LED switch on and the tenth iteration switch on.  This value was then divided by 120 to get the number of seconds for ten iterations, divided by 10 to get the number of seconds per iteration and finally multiplied by 1000 to get the latency in milliseconds.
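As a worked example (the frame count of 192 is hypothetical, chosen to line up with the first result below):

# 192 frames between the LED event and its tenth iteration, filmed at 120 fps:
#   192 / 120 = 1.6 s for ten iterations
#   1.6 / 10  = 0.16 s per iteration  =>  160 ms
echo "scale=1; 192 * 1000 / (120 * 10)" | bc    # prints 160.0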

  1. TCP – No Overclock – 160 ms
  2. TCP – Turbo Overclock – 160 ms
  3. UDP – No Overclock – 157 ms
  4. UDP – Turbo Overclock – 155 ms

As you can see from the graph, there is very little difference between the values.  Switching to UDP gives a 3 ms reduction in latency over TCP, and the Turbo overclock removes another 2 ms.  TCP doesn’t seem to benefit from overclocking at all.

Conclusion

Based on these results, I plan to switch future testing to UDP.  I’m not going to bother overclocking though, preferring the lower current draw and better long term stability of not overclocking.

Raspberry FPiV – Flight 4 – 1, 2, Tree

Extract

The aim of this flight was to test the airframe and the electronics after the unplanned lawn dart into the mud.  It was also the first test of the CRIUS CN-06 V2 U-blox GPS module.  I didn’t bother to stream the video to the laptop.  Instead I used the JuiceSSH app on my phone to access the Raspberry Pi and start a local 1080p HD and GPS recording.

Setup

The CRIUS GPS cable terminates in a standard 0.1″ pitch, four-pin socket.  The corresponding pins on the Raspberry Pi header are spread over five positions, so I borrowed a single-row socket housing from a PCB development jumper and re-arranged the connections as shown in the photo below.

[Photo: 2014-05-23 13.11.08 – re-arranged GPS connections]

The GPS was held on to the case with BluTak.

The only other change was to mount the camera on an extension so I could point it further down and to the right, the aim being to get the propeller out of shot.

I didn’t plan for any big tests on this flight so didn’t bother setting up the laptop.  Instead I used the JuiceSSH app on my phone to start recording the GPS and video output on the Pi using a short script.  As I wasn’t streaming the video, I bumped the resolution up to full 1080p.

#!/bin/sh
cat $0

# Zero duration is continuous
DURATION=0
WIDTH=1920
HEIGHT=1080
FRAMERATE=25
BITRATE=10000000
NOW=`date +%Y%m%d%H%M%S`
FILENAME=$NOW-Pi.h264
gpxlogger -i 2 -f $NOW-Pi.gpx &

/opt/vc/bin/raspivid -t $DURATION -w $WIDTH -h $HEIGHT -fps $FRAMERATE -b $BITRATE -n -o $FILENAME
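One note on the GPS side: gpxlogger reads from a running gpsd daemon rather than from the serial port directly.  A minimal sketch of starting gpsd by hand is shown below; the device name is an assumption (the Model B’s on-board UART) and will depend on how the GPS is actually wired.

# Hedged sketch: point gpsd at the serial port the GPS module is attached to
# (/dev/ttyAMA0 is the Pi Model B's on-board UART; adjust to suit)
sudo gpsd /dev/ttyAMA0 -F /var/run/gpsd.sock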

The Flight

Almost immediately I felt something wasn’t working properly.  Whilst the model was flying at normal speed, the climb rate was very low.  The wind wasn’t as strong as previously, but there still seemed to be some significant turbulence tossing the plane around.  After only one and a half circuits it started descending whilst over the far boundary and landed in a tree.

[Photo: PlaneStuckInATree]

The only visible damage was a broken tailplane and a broken prop.  I still had a WiFi connection from my phone to the Pi and was able to stop the recording and shut it down.  I had a 4m long strap in the car with a heavy buckle on one end, but it wasn’t long enough to reach the plane.  As this was lunchtime, I eventually had to leave it up the tree and return to work.

After work I stopped in at a local hardware store and bought 15 m of cord and a pack of 10 mm shackles.  The plane was still up the tree when I got back to the field, so I tied two of the shackles to the cord and started launching it skywards.  I soon found out I needed to tie some shackles to the loose end as well, to stop the whole lot ending up in the tree.  After about 10 minutes, I managed to get the cord over the fuselage and pull the plane free of the branches.

After disconnecting the battery, a more detailed examination revealed the only other damage to be a dent in the nose.  As the ESC had been connected for five hours, the battery had continued to drain and, when checked, was well past the minimum voltage.  One of the cells was reading 2.15 V, so that battery will be going for recycling.

Post-Mortem

I was able to extract the video and GPS files from the SD card.  It turns out the propeller was still partly visible in the video, and because of this I could see that the low voltage cutout had activated just as the plane crossed the far boundary.  This is why it started descending and ended up in the tree.  Why the battery went flat so quickly is another matter, as is the lack of power and the inability to climb.

Once I got everything on the bench I found the motor felt a bit rough.  I can only assume that grit had got into the bearings after the mud bath.  This could account for the increased current draw with the reduced power output.

Results

I think pointing the camera off to the side was a bad idea, even for a non-FPV flight.

The one positive result was the CRIUS GPS.  There was an excellent correlation between the video and the GPS track, as the two pictures below show.

[Image: Flight4-PowerLine]

[Image: Flight4-Trees]

Additional benefits for the CRIUS are:

  • Doesn’t require a separate power supply, as its current draw is low enough to run from the Raspberry Pi.
  • It’s a compact all-in-one unit.
  • Freely available from eBay.

Conclusion

From now on I shall be using the CRIUS GPS.

As the aircraft needs some rebuilding, I’m going to rework the internals so that I can mount the Raspberry Pi internally, with the camera above the nose.  I also need to build a new power pod that will mount above the wing.

As an alternative platform for some actual First Person View flying, I’m also building a tricopter.