
Free Live Video Streaming with HTTP Live Streaming, uStream and Justin.tv in a GNU Linux Environment

Well, that was a long headline. But it’s not even long enough. It should read:

Free Live Video Streaming with HTTP Live Streaming, uStream, Justin.tv, ffmpeg, vlc, x264, Wowza, CamTwist and Flash Media Encoder in a GNU Linux environment and Mac OS X.

but that would really have been too long. Anyway, now I’ve written it and hopefully you came here from Google searching for exactly this ;)


For our 200th anniversary show of Bits und so we wanted to provide a live video stream. Fortunately all the camera work and editing was done with BoinxTV.

The “only” remaining challenge was to get the video stream from a local Mac Pro to the intertubes without buying or renting any streaming software or service. Besides the Mac, which was the source of the stream, we also have a rented Linux server that already provides HTTP Live Streaming for our weekly audio podcast.

The first thing that comes to one’s mind is Ustream, or one of the other “free” online streaming platforms. You can easily upload your stream to those using Adobe’s Flash Media Live Encoder (available for Mac/Windows) or even QuickTime Broadcaster.

But as always, “free” (as in cheap) has its limits. With Justin.tv, only 1000 viewers can watch from a single country. Ustream has some other limitation I can’t remember right now.

Also, we wanted to provide an HTTP Live Stream since it’s the future and Flash needs to die die die … ;-)

The goal can be summarized like this: get a video stream from the Mac to the Linux server, turn it into an HTTP Live Stream, and feed it to Justin.tv and Ustream. Each of the following steps is described below. The final setup looks like this: [Architecture Overview]

Get a videostream from a Mac to Linux

It sounds rather simple: you have a video and want to stream it. Should be possible in the year 2010. Our source was CamTwist, free software that provides a virtual video device that can serve as input for broadcasting applications.

My very first idea was to simply use QuickTime Broadcaster and Darwin Streaming Server, where QuickTime Broadcaster is the application capturing video on the Mac and sending it to the server using RTP/RTSP. Unfortunately QuickTime Broadcaster is total crap. It crashes, it doesn’t let you change the FPS and it produces a stream which isn’t very compatible with the open-source world (namely ffmpeg). After some lost hours I decided to let it go.

A second possibility is VLC. The latest version for OS X can now capture the desktop (or parts of it) and stream it using all the protocols available within VLC. After being happy that I had found a working setup I noticed that something was missing: audio. Gna! But that’s okay, because all the crashes were beginning to drive me nuts anyway.

The third and very popular solution for broadcasting is Adobe’s Flash Media Live Encoder. Despite being from Adobe it simply does what it promises: capture video, encode it in “realtime” and send it to the world. The only problem: you need a Flash Media Server (RTMP).

A short look at Wikipedia reveals that several alternatives exist to Adobe’s own (very expensive) implementation, including some open-source variants. I took a quick look at Real’s Helix and Red5 but ended up using Wowza. Helix didn’t quite like me (I can’t remember the exact problem anymore), and with Red5 it seemed I would have to program something in Java.

Wowza offers streaming to clients via Microsoft Silverlight, Adobe Flash (RTMP) and RTSP (Real/QuickTime). It is a commercial product available under several licenses, one of them a so-called Developer License. Its limitations: only 10 simultaneous connections and a time limit on HTTP Live Streaming. Since this license is free, it’s just the right solution. 10 clients is more than enough because only the server itself will be a client; end users don’t connect to Wowza directly. The installation is simple and described pretty well already; I followed these instructions. The configuration for RTMP streaming is described by Wowza in the Quickstart Guide.
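Before moving on, it can help to verify from the Linux box that the stream actually arrives. This is just a sketch, assuming an ffmpeg build with librtmp support (exactly what we build below) and the stream name myStream.sdp used throughout this article:

```shell
# Pull a few seconds from the local Wowza server and dump them to a file.
# If this produces a playable FLV, the Mac -> Wowza leg of the chain works.
ffmpeg -i rtmp://localhost:1935/live/myStream.sdp \
    -t 10 -acodec copy -vcodec copy \
    -f flv /tmp/wowza-check.flv
```

Stream copy (-acodec copy -vcodec copy) keeps the check cheap; no re-encoding happens.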

Goal 1 achieved. The Wowza server receives a video stream from the Mac and offers it to clients via RTMP or RTSP. The source for HTTP Live Streaming, Justin.tv and Ustream is found.

HTTP Live Streaming Video

HTTP Live Streaming is a protocol draft by Apple, currently supported by iOS devices, QuickTime X and Safari on OS X. In short, the client loads a playlist from the server. The content of this playlist is a list of short clips (e.g. 10 seconds each) residing on a webserver. So on the server one needs to create and maintain a playlist file and the short segments of the stream.
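For illustration, such a playlist is a plain-text .m3u8 file. A minimal sketch (the segment file names here are made up, just to show the shape):

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:42
#EXTINF:10,
1_000042.ts
#EXTINF:10,
1_000043.ts
#EXTINF:10,
1_000044.ts
```

For a live stream the server keeps appending fresh segments and dropping old ones, and the client simply re-fetches the playlist.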

The segments need to be an MPEG transport stream. I already blogged about this challenge a while ago. You may want to read that first and then come back here for updated installation instructions.
In short: Carson McDonald has created an open-source solution that takes ffmpeg-compatible audio/video streams or files and creates an HTTP Live Stream from them. It consists of two parts: a binary that splits the stream into short segments, and a Ruby script that does all the other stuff, like uploading segments to a webserver and creating the playlist files.
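Conceptually the two parts are glued together with a pipe. A rough sketch, with placeholder paths, and the argument order taken from the usage output shown further below:

```shell
# ffmpeg turns the input into an MPEG transport stream on stdout;
# live_segmenter cuts it into 10-second .ts segments in /tmp/segments.
ffmpeg -i rtmp://localhost:1935/live/myStream.sdp -f mpegts - \
    | live_segmenter 10 /tmp/segments live

# In practice you don't run this by hand: the ruby script builds and runs
# pipelines like this from its .yml configuration (described below).
```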

Here are updated installation and configuration instructions for Debian Lenny 64bit. I recommend removing old versions of ffmpeg, x264 and so on. The following instructions assume a vanilla installation of Lenny.

Installing FFmpeg

Building ffmpeg is easy if you know how. The version described here includes several third-party libraries like x264, theora/vorbis and librtmp to ensure flexible usage. Since x264 requires libavformat (part of ffmpeg), we’re going to compile ffmpeg twice. In addition we’ll build an old version of ffmpeg, which is required for the live segmenter to work correctly.

You may perform all the following steps as root (which isn’t advised) or, like me, use sudo whenever required.

Start by installing several required packages:

sudo apt-get install \
build-essential \
subversion \
git-core \
yasm \
pkg-config \
libssl-dev \
libbz2-dev \
ruby
Check out the latest sources of ffmpeg and all the external libraries
svn checkout svn:// ffmpeg
git clone git://

# Download libvorbis, libogg and libtheora

# Checkout rtmpdump (for rtmp protocol support)
svn co svn:// rtmpdump

# Download libfaac sources (AAC Support)

# Download lame (MP3 Support)
It is necessary to build ffmpeg before x264, so let’s do this first. This initial build is rather simple since we don’t include any external libraries yet.
cd ffmpeg
./configure
make
sudo make install
Since we now have libavformat installed we can compile x264 and all the other external libraries. Compiling x264
cd ~/x264
./configure --enable-shared
make
sudo make install
Compiling libogg, libvorbis and libtheora
cd ~
tar -xvzf libvorbis-1.2.3.tar.gz
tar -xvzf libogg-1.1.4.tar.gz
tar -xvjf libtheora-1.1.1.tar.bz2

cd ~/libogg-1.1.4
./configure
make
sudo make install

cd ~/libvorbis-1.2.3
./configure
make
sudo make install

cd ~/libtheora-1.1.1
./configure
make
sudo make install
Compile and install librtmp
cd ~/rtmpdump/
make
sudo make install
Compile and install libfaac
cd ~
tar -xvjf faac-1.28.tar.bz2
cd faac-1.28
./configure
make
sudo make install
Compile and install lame (yes, some people still want MP3 instead of AAC)
cd ~
tar -xvzf lame-3.98.4.tar.gz
cd lame-3.98.4
./configure
make
sudo make install
Now that all required libraries are installed, it’s time to compile the ffmpeg that’s actually used later on.
cd ~/ffmpeg
make clean # We built it before. Remember? Let's remove it
./configure --enable-gpl --enable-nonfree \
    --enable-libmp3lame --enable-libfaac \
    --enable-librtmp --enable-libtheora \
    --enable-libvorbis --enable-libx264 \
    --enable-shared --enable-postproc
make
sudo make install
sudo /sbin/ldconfig #update cache

The system now has a nice ffmpeg installation with support for all required protocols and codecs. Let’s move on and install Carson McDonald’s HTTP Live Streaming solution.

Installing Carson’s HTTP Live Streaming solution

Get the latest source from github
cd ~
git clone git://

segmenter.c (the source of the binary) uses libavformat (from ffmpeg) to read the stream and split it. Unfortunately, current releases of ffmpeg contain a bug that prevents the segmenter from working; it will just sit there and do nothing. So one needs to use an old version of ffmpeg to compile segmenter.c.

I had success with the snapshot from 2009 I already linked to in my old article.

cd ~
tar -xvjf ffmpeg-export-snapshot-2009-12-02.tar.bz2
The old version is compiled and then installed to /tmp, or any other path you’d like. Just make sure you’re keeping it separate.
cd ~/ffmpeg-export-2009-12-01/
./configure --prefix=/tmp/old_ffmpeg
make
sudo make install
cd ~/HTTP-Live-Video-Stream-Segmenter-and-Distributor

# The following lines are 1 command!
gcc -v -Wall -g live_segmenter.c -o live_segmenter \
    -lavformat -lavcodec -lavutil -lvorbis -ltheora \
    -lbz2 -lm -lz -lfaac -lmp3lame \
    -lx264 \
    -I/tmp/old_ffmpeg/include \
    -L/tmp/old_ffmpeg/lib
After that you should have a live_segmenter binary in the current folder. To make it available, copy it to /usr/local/bin. I also like to have the Ruby script in /usr/local/bin.
sudo cp live_segmenter /usr/local/bin/
sudo cp http_streamer.rb /usr/local/bin/
sudo mkdir -p /usr/local/lib/site_ruby/
sudo cp hs_* /usr/local/lib/site_ruby/
That’s it. To check the installation, below are some commands with their expected output.
kyrios@segmenter:~$ pwd
kyrios@segmenter:~$ ffmpeg
FFmpeg version SVN-r25569, Copyright (c) 2000-2010 the FFmpeg developers
  built on Oct 25 2010 19:32:08 with gcc 4.3.2
  configuration: --enable-gpl --enable-nonfree --enable-libmp3lame --enable-libfaac --enable-librtmp --enable-libtheora --enable-libvorbis --enable-libx264 --enable-shared --enable-postproc
  libavutil     50.32. 3 / 50.32. 3
  libavcore      0. 9. 1 /  0. 9. 1
  libavcodec    52.93. 0 / 52.93. 0
  libavformat   52.84. 0 / 52.84. 0
  libavdevice   52. 2. 2 / 52. 2. 2
  libavfilter    1.53. 0 /  1.53. 0
  libswscale     0.12. 0 /  0.12. 0
  libpostproc   51. 2. 0 / 51. 2. 0
Hyper fast Audio and Video encoder
usage: ffmpeg [options] [[infile options] -i infile]... {[outfile options] outfile}...
Use -h to get full help or, even better, run 'man ffmpeg'

kyrios@segmenter:~$ live_segmenter 
Usage: live_segmenter segment <length> <output location> <filename prefix> 

kyrios@segmenter:~$ http_streamer.rb 
Usage: http_streamer.rb <config file>


The Ruby script http_streamer.rb reads its configuration from .yml files. Below I comment on some options from our configuration file. There are also some good (full) examples in the folder “example-configs” inside the HTTP-Live-Video-Stream-Segmenter-and-Distributor folder.
temp_dir: '/tmp'
The path where live_segmenter writes files before the ruby script moves them.
segment_prefix: '1'
Can be anything you like. In case you’re using Cloudfront or any other caching webspace you may want to change it when restarting the segmenter.
index_prefix: 'livebus'
Anything you like.
input_location: 'rtmp://localhost:1935/live/myStream.sdp'
That is the URL to the Wowza LiveStream. It can also be a file or any other input that ffmpeg understands.
segment_length: 10
Apple recommends and uses 10 seconds. Longer segments lead to longer delay; shorter segments lead to more hits on your webserver.
url_prefix: ''
This is the prefix for our S3 Bucket transforming it to a cloudfront request. If you’re not using Cloudfront/S3 you will most likely leave this empty.
source_command: 'ffmpeg -re -er 4 -y -i %s -vcodec libx264 -vpre medium -crf 20 -s 640x360 -acodec copy -r 30  -f mpegts -'
In theory the source command should just change the mux from flv (or anything else) to MPEG-TS. However I had problems with that, so here is an (in theory) unnecessary re-encode.
  • -re Read input at native framerate. Really important for input files (non-streams).
  • -vpre medium An x264 preset loading some defaults.
  • -crf 20 The desired quality. This is very good quality (1=best, 50=worst). The resulting file is big, but the “file” is just a pipe here, so that doesn’t matter, and we don’t want to lose too much quality during this encode.
  • -r 30 iOS devices showed jitter in the video at 15 FPS, so we encoded everything at 30.
  ffmpeg_command: "ffmpeg -er 4 -y -i %s -re -f mpegts -acodec libfaac -ac 1 -ar 48000 -ab 40k -s 400x224 -vcodec libx264 -b 96k -flags +loop -cmp +chroma -partitions +parti4x4+partb8x8+partp8x8+partp4x4 -subq 7 -trellis 1 -bf 6  -refs 6 -flags2 +dct8x8 -flags2 +wpred -me_range 16 -keyint_min 30 -sc_threshold 40 -i_qfactor 0.71 -bt 96k -maxrate 96k -bufsize 96k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 16:9 -g 250 -r 30 -async 2 - | %s %s %s %s %s"
  bandwidth: 136000

  ffmpeg_command: "ffmpeg -er 4 -y -i %s -f mpegts -acodec libfaac -ar 48000 -ab 40k -s 400x224 -vcodec libx264 -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 1 -coder 1 -me_range 16 -bf 6 -keyint_min 30 -sc_threshold 40 -i_qfactor 0.71 -crf 22 -maxrate 280k -bufsize 280k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 16:9 -g 250 -r 30 -async 2 - | %s %s %s %s %s"
  bandwidth: 320000

  ffmpeg_command: "ffmpeg -er 4 -y -i %s -f mpegts -acodec copy -s 640x360 -vcodec libx264 -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 1 -coder 1 -me_range 16 -keyint_min 30 -bf 6 -sc_threshold 40 -i_qfactor 0.71 -crf 22 -maxrate 500k -bufsize 500k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 16:9 -g 250 -r 30 -async 2 - | %s %s %s %s %s"
  bandwidth: 544000

  ffmpeg_command: "ffmpeg -er 4 -y -i %s -f mpegts -vn -acodec libfaac -ac 1 -ar 44100 -ab 64k - | %s %s %s %s %s"
  bandwidth: 64000

  ffmpeg_command: "ffmpeg -er 4 -y -i %s -f mpegts -vn -acodec libfaac -ac 1 -ar 44100 -ab 32k - | %s %s %s %s %s"
  bandwidth: 32000
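The bandwidth values end up in the variant playlist from which clients pick the stream matching their connection. A sketch of what such a variant playlist looks like (the actual file names are chosen by the Ruby script; these are made up from the index_prefix):

```
#EXTM3U
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=544000
livebus_544000.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=320000
livebus_320000.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=136000
livebus_136000.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=64000
livebus_64000.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=32000
livebus_32000.m3u8
```

The two audio-only entries (64k/32k) give cellular clients a fallback when video doesn’t fit through the pipe.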
Long, long hours of testing went into this; the commands above are the result. On an iPhone 4 and iPad the streams played fine, although those devices supposedly don’t support B-frames. Maybe these settings don’t work with older iOS devices. For all other options refer to the examples provided by Carson. Goal 2 achieved: HTTP Live Streaming.

Justin.tv

Justin.tv is a free live streaming platform. After registering, one may broadcast to it using Flash Media Live Encoder or QuickTime Broadcaster. They also provide a python script (jtvlc) that takes input from VLC and streams it to them.

Since the VLC version that comes with Debian is rather old (surprised?) and doesn’t support RTSP, we need to build it ourselves. Live555 provides an RTSP library.

Download Live555 Media (RTSP protocol support)
Install live555 (RTSP library)
cd ~
tar -xvzf live555-latest.tar.gz
cd live
./genMakefiles linux-64bit
make
Get the latest VLC sources
cd ~
mv download vlc-1.1.4.tar.bz2
tar -xvjf vlc-1.1.4.tar.bz2
Compile and install VLC with RTSP support
sudo apt-get install libpostproc-dev
cd ~/vlc-1.1.4
./configure --enable-v4l --enable-v4l2 --disable-nls \
--disable-mozilla --disable-dbus --disable-dbus-control \
--disable-telepathy --disable-lua --disable-cdda \
--disable-vcd --disable-dvb --disable-libcddb \
--disable-x11 --disable-glx --disable-opengl \
--disable-xvideo --disable-xvmc --disable-freetype \
--disable-fontconfig --disable-fb --disable-qt4 \
--disable-skins2 --disable-sdl --disable-sdl-image \
--disable-notify --disable-libgcrypt --disable-mad \
--enable-avcodec --enable-avformat --disable-a52 \
--with-live555-tree=$HOME/live/ --disable-xcb
make
sudo make install
cd ~
git clone git://
To start streaming to Justin.tv, perform the following steps:
  1. Start VLC with local RTP output
  2. Start jtvlc to read the RTP input from VLC and stream it to Justin.tv
/usr/local/bin/vlc -vv rtsp://localhost:1935/live/myStream.sdp --sout='#rtp{dst=,port=1234,sdp=file:///tmp/vlc.sdp}'
cd ~/jtvlc
python bus200 live_1492088_gdsNgILmX82ILdsqlgpiFodLQesdk4 /tmp/vlc.sdp
Replace bus200 with your login name and “live_14…” with your stream key. The stream should start immediately.


Ustream

Ustream is another free live streaming platform. Once you know how, it’s easy to stream to Ustream from Linux using ffmpeg. Here is the command:
ffmpeg -v 3 -i rtmp://localhost:1935/live/myStream.sdp -acodec copy -vcodec copy -f flv 'rtmp:// flashver=FMLE/3.0\20(compatible;\20FMSc/1.0)'

rtmp://localhost:1935/live/myStream.sdp is our local Wowza server.

rtmp:// is the streaming destination. You can get it by downloading an Adobe Flash Media Encoder configuration file from Ustream. Append the channel name, which can also be found in that XML (/rykWdC3q2KV2g1mRkto1vhpXoffair), to the URL.

When you log in to Ustream and hit broadcast you should see the live stream. Yes, that’s necessary: it’s not enough to send the data to them. You need to manually start the broadcast using their website.

Random notes

  • What I still don’t like is all the re-encoding, remuxing and switching of protocols. It’s amazing that all of this works in the end, but it’s horribly complicated.
  • Although I like how reliable Wowza is, I don’t like having it in the setup because it’s not open source.

If you have any questions or recommendations please leave a comment.
