So, my Raspberry Pi camera board has arrived and I have started playing with it.
My first impression: a tiny, super cheap camera. Its low-light capabilities are not great, but I can live with that. The important bit is the quality: full 1080p at 25 frames per second (UK). That on its own is pretty awesome.
I was really only interested in getting the Pi to work remotely for my robot, to replace the 3-4 fps standard-definition USB webcam I currently use. At the moment there is no V4L driver for the camera, so we have to make do with the provided applications and pipe their output into streaming tools. The recommended method is slow and very laggy (netcat and mplayer over wifi gives 3-6 seconds of lag), so I had to find another option. I happened to be sitting in the #raspberrypi IRC channel on freenode and heard a user having success with gstreamer. So here is how to set up gstreamer to stream HD video with less than 0.5 seconds of lag.
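For reference, the laggy netcat approach looks roughly like this (the port number and the mplayer options are my assumptions rather than an exact copy of the official instructions; on a Mac the listen syntax may be just nc -l 5001). On the receiving machine run
nc -l -p 5001 | mplayer -fps 31 -cache 1024 -   # port 5001 is an arbitrary choice
and on the Pi run
raspivid -t 999999 -o - | nc RECEIVER-IP-ADDRESS 5001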
First we need to add a repository that provides gstreamer1.0
sudo nano /etc/apt/sources.list
and add this line to the end of the file (inside the nano editor), then save:
deb http://vontaene.de/raspbian-updates/ . main
Then run sudo apt-get update
Next, grab gstreamer:
sudo apt-get install gstreamer1.0
On your receiving end you will also need gstreamer. Because I mainly use a Mac, I decided to get it working there; with help from arcanescu on IRC, we figured out how to get it working on Mac OS (10.8).
The simplest way is with brew, a package manager like apt-get but for Mac OS. To install it, run this in a terminal on your Mac:
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
It will grab and install brew. Now update it with
brew update
Now we need to grab gstreamer
brew install gstreamer gst-libav gst-plugins-ugly gst-plugins-base gst-plugins-bad gst-plugins-good
Once that installs you should be good to go. Enter
raspivid -t 999999 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=YOUR-PI-IP-ADDRESS port=5000
on your Raspberry Pi, and then enter this on your Mac:
gst-launch-1.0 -v tcpclientsrc host=YOUR-PI-IP-ADDRESS port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false
The picture quality is good, but I see a number of flickers. That isn't a problem for me; if it is for you, maybe try adjusting the resolution.
So thanks again to arcanescu; I take no credit for this, as he was the one who came up with it. If you ever see him on IRC, give him a virtual pat on the back.
His blog is at garagedeveloper.wordpress.com
Thanks, works great!
You might want to refer to the following page: http://wiki.matthiasbock.net/index.php/Hardware-accelerated_video_playback_on_the_Raspberry_Pi#gstreamer in order to be able to install gstreamer 1.0 🙂
Regards
Kristian Lauszus
I am getting a “no element avdec_h264” error
I tried to use sudo apt-get install gstreamer1.0 but that package does not exist. Did you add another repo somewhere?
Thanks for a nice article
Thanks for pointing that out, updated the post with the extra repo
use sudo apt-get install gstreamer1.0-tools
"sudo apt-get install gstreamer1.0" didn't work for me. Did you add another repo?
Fixed
Hi
On the receiving end, is there a gstreamer switch that will allow me to record the video stream? I can record on the sending end by piping raspivid through tee and then to gstreamer, but it would be very useful for my application to record the video at the receiver.
Thanks in advance and thanks for the great instructions.
Thanks for the excellent post. I got it to work as described and now I want to learn more about gstreamer…can you recommend any good tutorials for building gstreamer pipelines?
Many of the search results on-line point to resources for building an SDK.
Like David M. above, I’m looking to ‘tee’ the video to the screen and a filesink.
I am no expert on GStreamer, so I'm the wrong person to ask. arcanescu knows what he is doing, so you would be best to ask him via his blog, or pop onto the raspberrypi IRC channel (freenode) and hit him up there.
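For anyone who wants to experiment in the meantime, a receiver-side pipeline that tees the stream to both the screen and a file might look roughly like this. It is an untested sketch based on the Mac pipeline above; the queue elements, the matroskamux muxer and the output file name are my assumptions:
gst-launch-1.0 -v tcpclientsrc host=YOUR-PI-IP-ADDRESS port=5000 ! gdpdepay ! rtph264depay ! tee name=t ! queue ! avdec_h264 ! videoconvert ! autovideosink sync=false t. ! queue ! h264parse ! matroskamux ! filesink location=capture.mkv   # untested sketch; adjust host/port and file name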
Thanks for this helpful article! I'm a bit of a n00b, so when you say ‘add … to the end’, do you mean once you are inside the nano editor, or before you open it? The whole thing didn't work for me because I can't get past this basic hurdle! ;)
Hi there,
I’m a newbie so bear with me… I did exactly as described. The streamer seems to run on the Pi, but I get a message on my Mac: WARNING: erroneous pipeline: no element “rtph264depay”
What am I doing wrong?
Michael
Thank you ever so much for your blog. Really, thank you! Cool.
Hi, really nice tutorial thank you!
I managed to compile both on my Mac and Raspberry Pi, but when I start gstreamer on the Pi the camera LED comes on and it gets stuck on “Pipeline is prerolling”
Not original but true: Thanks
I managed to connect my laptop running Debian 7.1.0 to the Raspberry Pi/Raspbian using the netcat connection. It works fine but with the known delay.
I am trying to reproduce your experiment and everything seems to go well on the Raspberry side. However, on my laptop it says:
Estableciendo el conducto a PAUSA …
No protocol specified
No protocol specified
No protocol specified
libEGL warning: DRI2: xcb_connect failed
No protocol specified
libEGL warning: DRI2: xcb_connect failed
libEGL warning: GLX: failed to load GLX
No protocol specified
No protocol specified
ERROR: El conducto no quiere pausarse.
ERROR: del elemento /GstXvImageSink:autovideosink0-actual-sink-xvimage: Could not initialise Xv output
Información adicional de depuración:
xvimagesink.c(1291): gst_xvimagesink_xcontext_get (): /GstXvImageSink:autovideosink0-actual-sink-xvimage:
Could not open display
Estableciendo el conducto a NULL …
Liberando la tubería…
Half in Spanish, half in English. The parts in Spanish say the pipeline is trying to go to PAUSE but fails, so at the end it sets the pipeline to NULL and frees it.
Do you have any idea what could be happening?
Thanks again / Borja
I would like to have several Raspberry Pis, each streaming video from its Pi camera (just like in the above blog post) over wifi, and then a different Pi receiving and displaying one of the streams, with the ability to switch between them. The display Pi could ssh into the camera Pi that it wants and execute the above streamer (that’s one simple way to switch). But what would be the corresponding decoder and display command that matches the above encoder? Just running the above Mac command on the display Pi doesn’t seem to work. I have looked around for a “camera on one Pi, display on another Pi” example, but all I have found is Pi-to-home-computer streaming. Side note: I will probably be writing a touch panel interface to select the stream.
I am hoping that the one-Pi-to-another-Pi setup is a fairly easy question…
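I have not tried a Pi as the receiving end myself, but presumably you would swap the software decoder for the Pi's hardware one, something like the line below. This is an untested sketch: omxh264dec comes from the gstreamer1.0-omx package, and whether autovideosink picks a working sink on the Pi is an assumption on my part.
gst-launch-1.0 -v tcpclientsrc host=CAMERA-PI-IP-ADDRESS port=5000 ! gdpdepay ! rtph264depay ! h264parse ! omxh264dec ! autovideosink sync=false   # untested sketch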
Is it possible to use this over multicast UDP for multiple receivers?
Thank you for sharing! Very useful!
Regards,
Bruce
You need sudo to edit sources.list with nano.
Hi, I have an error saying
WARNING: erroneous pipeline: no element “avdec_h264”
on my RPi
and
ERROR: pipeline doesn’t want to preroll.
Setting pipeline to NULL …
Freeing pipeline …
What is the issue? Thanks
Hi, regarding the flicker issue, I had that for a while, but as is common with the Pi it turned out to be a power problem. Not only do you need a good supply, but the USB cable is very often too thin in cheap cables; 28 gauge will cause flicker, and ethernet might stop working. 22 gauge should work, or just try different cheap cables and power supplies until it goes away.
Hi all,
I don't have a MacBook. How can I open the stream on a Linux (Debian 7.3.0) laptop?
Thanks for answering!
Simon
What kind of stream does this produce? I tried viewing it with VLC on a PC as rtsp and http, but VLC could not display it.
If it's some kind of special stream, how can I produce one that VLC understands?
I’m also interested to know this.
Did you find any solution for this?
When I do “sudo apt-get install gstreamer1.0” it can’t find the package. “sudo apt-get update” seemed to fail:
Ign http://vontaene.de . Release.gpg
Ign http://vontaene.de . Release
Err http://vontaene.de ./main armhf Packages
404 Not Found
Ign http://vontaene.de ./main Translation-de_DE
Ign http://vontaene.de ./main Translation-de
Ign http://vontaene.de ./main Translation-en
I have found the fault. I forgot the ‘p’ in “raspbian-updates”. Next time, I will try copy-paste over SSH. 😉
Hey,
Thanks for the tutorial! Very helpful, but I have a question. For some reason it runs SUPER slowly the third time I've run it. Does gstreamer record and save the files, or does it just stream them?
What resolution and bitrate was the video when you sent it over the network? 0.5s latency sounds absolutely fantastic.
Thanks for the guide! With your help I was able to get the stream working, and the delay did not increase even when I ran the camera at full HD / full fps / 5.5 Mbit/s.
I found that sending video over UDP reduced the delay even more. I was not expecting much difference between UDP and TCP on ethernet, but it really did seem to reduce the delay from around 500ms to somewhere around 100-200ms, which was barely noticeable anymore.
To set up sending the stream over UDP:
On OS X, first start the host so it is ready to listen for UDP packets:
gst-launch-1.0 -v udpsrc address=HOST_IP_ADDR port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false
Then start sending on the raspi:
raspivid -t 0 -b 5500000 -n -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! udpsink port=5000 host=HOST_IP_ADDR
One thing I couldn't get to work was starting the receiver after the stream had already begun sending… maybe someone knows the gstreamer magic for that 🙂
Adding caps to the receiving side helped it start even in the middle of a UDP stream:
gst-launch-1.0 -v udpsrc address=HOST_IP_ADDR port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false
For those who are interested in what those caps mean, here is an explanation: http://www.iana.org/assignments/rtp-parameters/rtp-parameters.xhtml
Also, I removed one unnecessary filter pass from the streaming side:
raspivid -t 0 -b 5500000 -n -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink port=5000 host=HOST_IP_ADDR
To store the UDP stream to a file, one can replace the stream reader with
gst-launch-1.0 -v udpsrc address=HOST_IP_ADDR port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! h264parse ! matroskamux ! filesink location=capture.avi
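Building on the same idea, multicast should let several receivers watch one stream. This is untested, and the multicast group address and the auto-multicast flags are my assumptions. Sender on the raspi:
raspivid -t 0 -b 5500000 -n -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=224.1.1.1 port=5000 auto-multicast=true   # 224.1.1.1 is an arbitrary multicast group
and on each receiver:
gst-launch-1.0 -v udpsrc address=224.1.1.1 port=5000 auto-multicast=true caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false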
Hey, thanks for this great tutorial. Now I have one problem. I want to feed the video signal into the program live-view-rift on a Windows computer. Can you help me with the pipeline parameters? I couldn't find out which are the right ones. Just typing in the IP isn't working.
The link for downloading gstreamer on mac is not valid anymore, it is now:
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
Thanks, updated.
I got this working on Windows, so thanks for that.
However, I can't seem to find a way to play it in fullscreen. Any ideas?
Hello
I have a Wolfson audio card for my Rev B and I would like to stream just audio to my Pi 2 using this method. Would you be able to assist me?
Thanks
If like me you were getting the error:
“libgstreamer1.0-0 is already the newest version.”
it’s because you don’t need to edit the file /etc/apt/sources.list. The package is already available in the Raspbian repository.
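In that case you should be able to pull everything straight from the standard repo; something like the line below should cover the tools and plugins (the exact package list is my assumption, so check with apt-cache search gstreamer1.0):
sudo apt-get install gstreamer1.0-tools gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly gstreamer1.0-libav   # package names assumed from the Raspbian repo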
When I try to edit my sources.list file, I see the following…
deb http://mirrordirector.raspbian.org/raspbian/ jessie main contrib non-free rpi
# Uncomment line below then ‘apt-get update’ to enable ‘apt-get source’
#deb-src http://archive.raspbian.org/raspbian/ jessie main contrib non-free rpi
I am stuck now because I don't know how to do what they are asking me to do, let alone execute the fix. I am brand new to Raspberry Pi programming. I just wanted to make my camera stream over the internet. Can you help? I have no clue how to uncomment something, not to mention where to find what I am trying to uncomment. Thanks.
Some of this stuff is a little outdated, but I got it working earlier today. Being new to programming is not an issue. Just find the individual steps, then search online for how to do them.
For example, the code for installing Brew no longer works unless you have Xcode, but Xcode doesn't work on OS X unless it's Xcode 6.4, and things like that.
Break down the individual parts, search online how to do each of those, and you’ll fly through this.
The main problem people run into is using different ports on the Mac and the RPi but still expecting them to work.
You just have to add the line below all of that, and then save 🙂 You don't really have to uncomment anything.
Hello, I have a problem: on my Mac the video is mirrored, the picture and the text are reversed. I want to read text with the camera, what can I do about this?
Thanks for any reply
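For what it's worth, the -hf flag in the raspivid command above flips the picture horizontally, which would explain the mirrored text; dropping it (or adding -vf if the image is also upside down) should fix the orientation. An untested variant of the Pi-side command without the flip:
raspivid -t 999999 -h 720 -w 1080 -fps 25 -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=YOUR-PI-IP-ADDRESS port=5000   # same as the original command, minus -hf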
I am having a problem on the Mac side:
brew install gstreamer gst-libav gst-plugins-ugly gst-plugins-base gst-plugins-bad gst-plugins-good
Error: You must `brew link libpng` before gstreamer can be installed
Error: You must `brew link libpng xz` before gst-libav can be installed
Error: You must `brew link libpng jpeg` before gst-plugins-ugly can be installed
Error: You must `brew link libpng` before gst-plugins-base can be installed
Error: You must `brew link libpng jpeg` before gst-plugins-bad can be installed
Error: You must `brew link libpng jpeg` before gst-plugins-good can be installed
Is there any way to do this using a camera that is hooked up over USB?
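I have not tried it, but for a USB webcam you would replace raspivid with a v4l2src pipeline and encode on the Pi. This is an untested sketch; the device path, resolution and the omxh264enc hardware encoder element (from the gstreamer1.0-omx package) are my assumptions:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! videoconvert ! omxh264enc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=YOUR-PI-IP-ADDRESS port=5000   # untested sketch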
I'm wondering how to show gst-launch's video on a web page served from our server, instead of launching gst on the client.
Or maybe we can show it in a video player on the client?
Getting this message:
** (gst-launch-1.0:655): CRITICAL **: gst_gl_window_get_context: assertion ‘GST_IS_GL_WINDOW (window)’ failed
Caught SIGSEGV
exec gdb failed: No such file or directory
Spinning. Please run ‘gdb gst-launch-1.0 655’ to continue debugging, Ctrl-C to quit, or Ctrl-\ to dump core.
Couldn't find a solution on the web, any ideas? Thanks
My team has had a lot of success with this, but there's one thing we'd like to change. We're sending the stream over a wireless Ubiquiti connection, and when the connection drops for a second, the video freezes, followed by sped-up playback of what happened during the downtime until it catches up to real time.
Is there a way to drop this so that we are seeing real time as soon as we get connection back?
I added this: deb http://vontaene.de/raspbian-updates/ . main. Unfortunately, it doesn't work with Debian Stretch (version 9).