Saturday, May 22, 2010

Use ffmpeg to set up a streaming server on Android

ffmpeg is a powerful media library. It provides the ffserver tool, which can be used to set up a streaming server.
Here is how to compile ffmpeg for Android using CodeSourcery's cross compiler.

1. Download and extract the ffmpeg source code.
2. Use the commands below to compile ffmpeg:
./configure --arch=arm --cross-prefix=arm-none-linux-gnueabi- --extra-ldflags=-static --target-os=linux
make
3. Run file ffserver && readelf -d ffserver, or arm-none-linux-gnueabi-objdump -x ffserver | grep NEEDED, to make sure ffserver is statically linked.
4. Transfer the ffserver tool, the ffserver.conf file (which defines what media files ffserver will serve), and the media files to the Android device with the adb push command.
5. Start the streaming server with ./ffserver -f ffserver.conf in the Android shell.
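Put together, the five steps above look roughly like the following session (the /data destination and the file names are assumptions; adjust them for your setup):

```shell
# 1-2. Configure and build a statically linked ffmpeg/ffserver
./configure --arch=arm --cross-prefix=arm-none-linux-gnueabi- \
            --extra-ldflags=-static --target-os=linux
make

# 3. Verify ffserver is statically linked (grep should print nothing)
file ffserver
arm-none-linux-gnueabi-objdump -x ffserver | grep NEEDED

# 4. Push the binary, the config file, and the media files
adb push ffserver /data/ffserver
adb push ffserver.conf /data/ffserver.conf
adb push 1.mp3 /data/1.mp3
adb push 1.mp4 /data/1.mp4

# 5. Start the server from the Android shell
adb shell
cd /data && chmod 777 ffserver && ./ffserver -f ffserver.conf
```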

Below is a sample ffserver.conf file, which tells ffserver to listen on RTSP port 7654. It defines two media files for streaming, /data/1.mp3 and /data/1.mp4, so make sure these files exist.


# Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090

# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0

# Port on which the RTSP server is listening.
RTSPPort 7654

# Address on which the RTSP server is bound. Only useful if you have
# several network interfaces.
RTSPBindAddress 0.0.0.0

# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 1000

# This is the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 1000


# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -

# Suppress that if you want to launch ffserver as a daemon.
NoDaemon

<Stream 1.mp4>
Format rtp
File "/data/1.mp4"
</Stream>

<Stream 1.mp3>
Format rtp
File "/data/1.mp3"
</Stream>

To test ffserver, start a media player that supports streaming and open the URL rtsp://{ip address of the android device}:7654/1.mp3.
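For example, from a desktop machine on the same network (the device IP below is a placeholder):

```shell
# Replace 192.168.1.100 with your device's actual IP address
ffplay rtsp://192.168.1.100:7654/1.mp3
# or open the same URL in VLC:
vlc rtsp://192.168.1.100:7654/1.mp4
```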

49 comments:

Yadnesh said...

I am a bit confused:

Is this build done using the Android NDK, or is the Android source needed?

I am trying to build FFMpeg for Android and getting many errors.

All suggestions welcome:)

rxwen said...

I used a third-party cross compiler, whose usage is shown in this post.
I didn't use the Android NDK or the Android source build system.

Yadnesh said...

Thanks :) this worked

I am still not able to use ffserver to stream, but that has nothing to do with Android. I'm just not able to create the right config.

nayan said...

This is good stuff. Using this build, can we do streaming with ffmpeg on the Android emulator?

If you have any ideas, please let me know.

Thanks,
NBK

rxwen said...

I didn't do this on the emulator myself, but I believe it's feasible if your player can connect to the emulator over the network.

nayan said...

Hi to all,

I have built ffmpeg on Android and I can run the ffmpeg executable in the adb shell. Now I want to use a live IP camera stream (on the LAN) as input to ffmpeg.
When I tried to ping the network IP camera from the Android shell, it showed 100% packet loss. How can I access a network IP camera on Android?

Thanks,
NBK

rxwen said...

By default, the emulator can't act as the server side and accept connections from an application on your host machine.
To enable this, you need to use the adb forward command to map a port inside the emulator to a port on the host machine. After the mapping has been established, your application can communicate with the emulator's app via the mapped port. Please take a look at this: http://rxwen.blogspot.com/2009/11/adb-for-remote-connections.html

But as far as I know, the adb command can only map a tcp port or a unix domain socket. So it may require some modifications to the adb tool to make it support udp ports, which are widely used in media streaming.
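A sketch of such a mapping (the port number is an arbitrary example):

```shell
# Forward host TCP port 7654 to TCP port 7654 inside the emulator
adb forward tcp:7654 tcp:7654
# The host-side application then connects to localhost:7654
```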

Smet said...

"Must specify target arch and OS when cross-compiling"

is what I get when running:

"./configure --arch=arm --cross-prefix=arm-none-linux-gnueabi- --extra-ldflags=-static"

.... why?

rxwen said...

try this:

./configure --arch=arm --cross-prefix=arm-none-linux-gnueabi- --extra-ldflags=-static --target-os=linux

to specify target-os explicitly.

Jay-J said...

Hi,

Great post, although a bit sparse on the details :). For example, the ffmpeg doc page mentions that ffserver needs video streamed from an ffmpeg instance. Just wanted to ask if this is indeed the case. It basically says give the following command:

ffserver -f ffserver.conf &
ffmpeg -i INPUTFILE http://localhost:port/test.ffm


where INPUTFILE is the file you want to stream

Are you saying that ffmpeg is unnecessary and the required files can be specified in the .conf?

Secondly (and most importantly), I'm trying to stream video from my cellphone's camera (Nexus One). I tried using -i /dev/msm_camera and -i /dev/pmem_camera as input files for ffmpeg, but I got "permission denied" for the second one and "cannot perform" for the first. Is this the right way to do it? Thanks in advance

rxwen said...

Hi,

Yes, some of my posts do lack details. Even I have found details missing when revisiting a post written long ago.

For your first question: based on my test, it's not always necessary to use ffmpeg as the source; a local file specified in the configuration file is also OK.

For the second one: I guess you need to 'root' your Nexus One. Try googling "root nexus one". But if you do this, you may lose your warranty. Even worse, there is a risk of "bricking" it. Hope you're lucky. :)

Jay-J said...

Hi, thanks for the quick reply.

Will rooting guarantee access to the camera device from the command line? I don't want to root it unless it's guaranteed to work...

rxwen said...

I'm not able to guarantee this, since I can't test it myself. I just suspect it's caused by the account not having enough privileges.

And I'm curious why you would like to manipulate the device file directly. Why not make use of the upper-layer APIs?

Jay-J said...

I'm trying to write an application to stream live video from the Android phone camera. Cross-compiling the ffmpeg library allows it to work on an Android phone. If ffmpeg can take the Android camera as input, that can be sent to ffserver and streamed.

rxwen said...

Yes, I understand your intention. I just think it will be more portable to use classes like MediaRecorder provided in the Java layer. More importantly, you may get around the permission issue by using the Java API.

Clement said...

@rxwen: I'm doing something similar to what Jay-J is attempting. But the MediaRecorder API can only record to a file (not to memory arrays), and as far as I understand, the file is only playable once the recording is finalised and the MediaRecorder stopped.

This way, we would not be able to get seamless video output on the client Android while recording is still happening on the server Android, right?

rxwen said...

Clement,

I didn't notice MediaRecorder's limitation.
android.hardware.Camera should be more suitable for capturing data to memory.

rxwen said...

Clement,

MediaRecorder can output to a fd, which can be the fd to a socket.
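A minimal sketch of that idea, using ParcelFileDescriptor.fromSocket to obtain the fd (the host, port, class name, and encoder choices here are assumptions, and whether the receiver can play the resulting byte stream depends on the container format):

```java
import java.io.IOException;
import java.net.Socket;
import android.media.MediaRecorder;
import android.os.ParcelFileDescriptor;

public class SocketRecorder {
    // Hypothetical sketch: point MediaRecorder's output at a TCP socket's fd
    // instead of a file path. Error handling and release() omitted for brevity.
    public static MediaRecorder startStreaming(String host, int port) throws IOException {
        Socket socket = new Socket(host, port);               // assumed receiver address
        ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

        MediaRecorder recorder = new MediaRecorder();
        recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
        recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
        recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H263);
        recorder.setOutputFile(pfd.getFileDescriptor());      // fd instead of a path
        recorder.prepare();
        recorder.start();
        return recorder;
    }
}
```

Note that a socket fd is not seekable, so a container that writes its index (moov atom) at the end may produce an unplayable stream; that caveat matters for the discussion below.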

Clement said...

Hi rxwen,

Thanks for your suggestion.

It does sound promising, but I've not been able to get an FD from a socket.

From what I understand, it is only possible to call getFD on a SocketImpl, but SocketImpl is an abstract class and cannot be instantiated.

Have you managed to do what you described? I would very much appreciate it if you could point me to some resources on the SocketImpl classes, since googling doesn't really help much.

Cheers!

rxwen said...

Clement,

Maybe you can try using reflection to get the fd, e.g. (requires imports of java.io.FileDescriptor, java.lang.reflect.Field, java.net.Socket and java.net.SocketImpl):

public static int getSocketFD(Socket socket)
        throws ClassNotFoundException, IllegalArgumentException, IllegalAccessException {
    // Dig out the private "impl" field of java.net.Socket.
    SocketImpl impl = null;
    Field[] fs = Class.forName("java.net.Socket").getDeclaredFields();
    for (Field f : fs) {
        f.setAccessible(true);
        if (f.getName().equals("impl")) {
            impl = (SocketImpl) f.get(socket);
            break;
        }
    }
    if (impl == null) return -1;
    // Then the protected "fd" field of java.net.SocketImpl.
    FileDescriptor fd = null;
    fs = Class.forName("java.net.SocketImpl").getDeclaredFields();
    for (Field f : fs) {
        f.setAccessible(true);
        if (f.getName().equals("fd")) {
            fd = (FileDescriptor) f.get(impl);
            break;
        }
    }
    if (fd == null) return -1;
    // Finally the int "fd" field of java.io.FileDescriptor.
    fs = Class.forName("java.io.FileDescriptor").getDeclaredFields();
    for (Field f : fs) {
        f.setAccessible(true);
        if (f.getName().equals("fd")) return f.getInt(fd);
    }
    return -1;
}

Personally, I'd prefer using android.hardware.Camera to capture bitmaps and transfer them via RTP.

I haven't had a chance to try it myself yet, so can you let us know whether these two approaches work after you try them? Really appreciate it.

Clement said...

Hi rxwen,

Thanks for your suggestions.

Did not try the first method (socketFD) because I realised that it would again run into the problem of not being able to watch the video until the recording is finalised on the sender side. I know this because I have tried extracting parts of the media file when it is still being recorded to, and it cannot be played.

For the second method, I found that it is rather difficult to extract the bitmap from the camera. Please refer to http://code.google.com/p/android/issues/detail?id=823 for more details about this.

Would appreciate any new suggestions you have, but for now I'm going to continue working on the method as per the discussion in the given link.

Christina Loop said...

Hello rxwen,

Thank you very much for your helpful post.
There is something I would like to ask you. I have compiled ffmpeg with the help of the cross compiler you suggest, and I've put the ffmpeg and ffserver tools on my Android mobile. The error I get is: permission denied. Do you know what the problem is? And how should I run ffmpeg and ffserver - with ./ffserver -f etc., as you suggest, or just ffserver -f etc.? Either way I get permission denied...

Thank you in advance for your help,
Christina

Jay-J said...

@Christina: How are you putting them on the phone and running them? The Android command line? There are only certain directories where you can copy and run these things; you cannot run native apps from the SD card, for instance. And you need to use the command line (can't remember which command, but you can look it up) to load the files onto the phone and then run them.

rxwen said...

Christina,

You also need to run "chmod 777 executable_name" to grant execute permission.

Christina said...

Thank you very much for your suggestions! I needed to put it in a folder outside the sdcard and run chmod. I have a rooted phone, and I am trying to capture video from the camera for live streaming. If I succeed, I will let you know!

Take care,
Christina

biolizard89 said...

Any idea if it's possible to cross-compile ffmpeg with libx264 support for Android? If so, any chance you could enlighten me on how? On that note, if one uses the fastest possible x264 preset (I think "ultrafast"), any idea if the Android phone can handle real-time encoding?

Christina, any luck with the camera?

Thanks.

Natrix said...

I've been experimenting with streaming from my Galaxy I9000 (Froyo) all day with ffmpeg. I tried chowning both ffmpeg and /dev/video0, and also chmod on /dev/video0.
It seems there is some security limit that reboots the device every time I try to access the camera. If someone can tell me what to do, or how to get a log from Android to find out what causes the reboot, please let us know, because I believe it's the last obstacle between me and live streaming.

Jay-J said...

Is your phone rooted, Natrix? When I tried what you did (chmod etc.) on an unrooted Moto Droid and a Nexus One, it didn't reboot; it simply gave an error and didn't work. I always thought rooting would make it work, but I didn't want to root the phones I had...

Natrix said...

Well, my phone is rooted and I have busybox installed.


Regards

loop said...

Hello everyone,

I had the same problem with /dev/video0. The only thing that worked for me was recording with my app, saving the video file to the application's folder (not the sdcard), and running the ffmpeg command from there (via adb shell):

./ffmpeg -f h263 -s 320x240 -re -i file -sameq -vcodec mpeg2video -re -f mpeg2video rtp://"Your pc ip"

Then you will be able to see the video in a VLC player...

The problem with this method is that the streaming stops if the mobile phone has a fast processor, because the input format we specify, h263, is of course wrong: the phone actually saves the video as either mpeg4 or 3gp. Still, it is the only setting that works, since those two formats put the moov atom at the end of the file and ffmpeg cannot stream them because it cannot find it. That is why I put h263, to make ffmpeg think there is no moov atom. And it works... but not on all mobile phones. It also needs some time, approximately 8 seconds on my phone, otherwise the streaming stops immediately.

What is more, ffmpeg on Android does not seem to obey when I change the bitrate or the framerate...

I have tried different ways to make it work, either with a pipe or with cat, but everything leads to the same conclusion.

I think this can only work correctly from inside the program. That is why all the known applications use the ffmpeg libraries and stream the file through a buffer.

If someone has some crazy idea with which this might work, I would gladly use it, since I am not very happy about digging through the ffmpeg libraries one by one...

Best Regards,
Christina

Jachym said...

Is it possible to get an ffmpeg binary, already compiled, to run in the terminal? I really need that one :)

rxwen said...

see if this works:
http://code.google.com/p/rxwen-blog-stuff/downloads/detail?name=ffserver&can=2&q=#makechanges

Henrique said...

Has anyone really performed video streaming from Android yet?

Anonymous said...

Hi,

Is it possible to stream camera data live using ffserver ?

Camera -> ffserver -----> PC with VLC
Regards
Junky

Anonymous said...

Stream from laptop using ffserver -> yes. Just like VLC. There are instructions online, it's real easy.

Stream from Android device using ffserver cross-compiled to ARM platform -> mostly no. Camera device access isn't allowed.

Anonymous said...

Hi,

Is it possible to stream Android camera data over ffserver so that I can receive it in VLC on the same network?

Is it possible to pass a raw buffer (YUV420) that I get from the Android camera to ffserver? I don't have access to video4linux2. Please let me know a configuration with which I can stream on-the-fly data from the Android camera.

What are the steps I need to accomplish this task? Please suggest.

Junky

Saud said...

How can I use ffserver in my Android application? I don't want to run ffserver from the Android shell; I want to run it from my Android application, behind a button. Is there any way I can access ffserver from Android code and give it the directory list of my videos in some function?

rxwen said...

You can use the NDK to compile ffserver as a library (remove the main function), and start ffserver from Java code through JNI.
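A rough sketch of the Java side of that approach (the class, method, and library names are assumptions; the C side would rename ffserver's main() and expose it through a JNI wrapper):

```java
// Hypothetical JNI wrapper; assumes ffserver's main() was renamed to
// ffserver_main() and built into a shared library named "ffserver".
public class FFServer {
    static {
        System.loadLibrary("ffserver");   // loads libffserver.so
    }

    // Native entry point, e.g. implemented in C as:
    //   JNIEXPORT jint JNICALL Java_FFServer_run(JNIEnv *env, jclass cls, jstring conf)
    //   { /* build argv = {"ffserver", "-f", conf} and call ffserver_main() */ }
    public static native int run(String configPath);
}
```

From a button handler it would be started on a background thread, since run() blocks:

```java
new Thread(new Runnable() {
    public void run() {
        FFServer.run("/data/ffserver.conf");
    }
}).start();
```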

Saud said...

My ffserver, built using your method, has started, but I can't play the stream using the Android media player or any other media player.
I'm connecting my Android phone to my Android emulator; I have done port forwarding (using a proxy server) and everything. Below is the output I'm getting from ffserver:

C:\android-sdk-windows\platform-tools>adb shell
# cd data
cd data
# ./ffserver -f ffserver.conf
./ffserver -f ffserver.conf
ffserver version 0.8, Copyright (c) 2000-2011 the FFmpeg developers
built on Jun 23 2011 09:52:28 with gcc 4.5.2
configuration: --arch=arm --cross-prefix=arm-none-linux-gnueabi- --extra-ldflags=-static --target-os=linux
libavutil 51. 9. 1 / 51. 9. 1
libavcodec 53. 7. 0 / 53. 7. 0
libavformat 53. 4. 0 / 53. 4. 0
libavdevice 53. 1. 1 / 53. 1. 1
libavfilter 2. 23. 0 / 2. 23. 0
libswscale 2. 0. 0 / 2. 0. 0
Sat Mar 10 18:22:59 2012 Opening file '/sdcard/Video/rvp.flv'
Sat Mar 10 18:23:01 2012 [flv @ 0xe68c40]Estimating duration from bitrate, this may be inaccurate
Sat Mar 10 18:23:01 2012 Opening file '/sdcard/Video/video1.3gp'
Sat Mar 10 18:23:02 2012 FFserver started.
Sat Mar 10 18:23:36 2012 10.0.2.2 - - [DESCRIBE] "rtsp://192.168.1.111:5000/video1.3gp RTSP/1.0" 200 167
Sat Mar 10 18:23:41 2012 10.0.2.2 - - [DESCRIBE] "rtsp://192.168.1.111:5000/video1.3gp RTSP/1.0" 200 167
Sat Mar 10 18:24:53 2012 10.0.2.2 - - [DESCRIBE] "rtsp://192.168.1.111:5000/video1.3gp RTSP/1.0" 200 167
Sat Mar 10 18:26:20 2012 10.0.2.2 - - [DESCRIBE] "rtsp://192.168.1.111:5000/video1.3gp RTSP/1.0" 200 167

And in the Android media player I'm getting the error that this video can't be played. Please help me.

Saud said...

Okay, I resolved this issue by changing the config file. But I'm not able to play .flv files in the Android player, so how can I stream an encoded stream, given that Android supports H.264?


restoreTemple said...

Hi - I am getting the error "sh: ./ffmpeg not executable: magic 7F45" when I try to run it on my android-x86.

I had compiled ffmpeg with the Sourcery toolchain using ./configure --arch=arm --cross-prefix=arm-none-linux-gnueabi- --extra-ldflags=-static --target-os=linux

Any help will be very appreciated.

Thanks
RC

rx wen said...

I didn't try android-x86 before, but I think you need a toolchain targeting the x86 platform instead.

Bill said...

Hi rxwen,

Can you give me some hints on how to use the NDK to compile ffserver as a library?

Thanks.