
Top streaming frequently asked interview questions

UISlider with ProgressView combined

Is there a built-in, Apple-provided way to get a UISlider combined with a UIProgressView? This is used by many streaming applications, e.g. the native QuickTime player or YouTube. (Just to be sure: I'm only interested in the visualization.)

[Image: slider with loader]

Cheers, Simon
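
A common approach (a sketch, not an official Apple control; the view names and frame values here are my own) is to place a UIProgressView behind a UISlider whose track images are transparent, so the progress bar shows through as the buffering indicator:

UIProgressView *bufferBar = [[UIProgressView alloc] initWithProgressViewStyle:UIProgressViewStyleDefault];
bufferBar.frame = CGRectMake(20.0, 100.0, 280.0, 9.0);

UISlider *slider = [[UISlider alloc] initWithFrame:CGRectMake(18.0, 92.0, 284.0, 23.0)];
// Hide the slider's own track so only the progress view shows through.
UIImage *clearImage = [UIImage new];
[slider setMinimumTrackImage:clearImage forState:UIControlStateNormal];
[slider setMaximumTrackImage:clearImage forState:UIControlStateNormal];

[self.view addSubview:bufferBar];
[self.view addSubview:slider];

// Update from the player's buffering callback, e.g.:
// bufferBar.progress = bufferedDuration / totalDuration;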


Source: (StackOverflow)

Upload live streaming video from iPhone like Ustream or Qik

How do I live-stream video from an iPhone to a server, the way Ustream or Qik do? I know there's something called HTTP Live Streaming from Apple, but most resources I found only talk about streaming video from a server to the iPhone.

Is Apple's HTTP Live Streaming something I should use, or something else? Thanks.


Source: (StackOverflow)


Stream large binary files with urllib2 to file

I use the following code to stream large files from the Internet into a local file:

import urllib2

fp = open(file, 'wb')   # 'file' is the local destination path
req = urllib2.urlopen(url)
for line in req:        # iterates line by line, which is slow for binary data
    fp.write(line)
fp.close()

This works, but it downloads quite slowly. Is there a faster way? (The files are large, so I don't want to keep them in memory.)
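
For comparison, a common speed-up (a sketch, not part of the original post; 'url' and 'file' as above) is to copy fixed-size binary chunks instead of iterating line by line, since line iteration has to scan every byte for newlines:

import shutil
import urllib2

req = urllib2.urlopen(url)
with open(file, 'wb') as fp:
    # copy in 16 KB chunks instead of line by line
    shutil.copyfileobj(req, fp, 16 * 1024)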


Source: (StackOverflow)

Streaming video from Android camera to server

I've seen plenty of info about how to stream video from a server to an Android device, but not much about the other way, à la Qik. Could someone point me in the right direction, or give me some advice on how to approach this?


Source: (StackOverflow)

Best approach to real time http streaming to HTML5 video client

I'm really stuck trying to understand the best way to stream the real-time output of ffmpeg to an HTML5 client using node.js. There are a number of variables at play, I don't have a lot of experience in this space, and I have spent many hours trying different combinations.

My use case is:

1) An IP video camera's RTSP H.264 stream is picked up by FFMPEG and remuxed into an MP4 container using the following FFMPEG settings in node, with output to STDOUT. This is only run on the initial client connection, so that partial-content requests don't try to spawn FFMPEG again.

liveFFMPEG = child_process.spawn("ffmpeg", [
    "-i", "rtsp://admin:12345@192.168.1.234:554",
    "-vcodec", "copy",
    "-f", "mp4",
    "-reset_timestamps", "1",
    "-movflags", "frag_keyframe+empty_moov",
    "-"                      // output to stdout
], { detached: false });

2) I use the node HTTP server to capture the STDOUT and stream that back to the client upon a client request. When the client first connects, I spawn the above FFMPEG command line, then pipe the STDOUT stream to the HTTP response.

liveFFMPEG.stdout.pipe(resp);

I have also used the stream's data event to write the FFMPEG data to the HTTP response, but it makes no difference:

liveFFMPEG.stdout.on("data", function(data) {
    resp.write(data);
});

I use the following HTTP headers (which are also used, and working, when streaming pre-recorded files):

var total = 999999999;             // fake a large file size
var partialstart = 0;
var partialend = total - 1;

if (range !== undefined) {
    var parts = range.replace(/bytes=/, "").split("-");
    partialstart = parts[0];
    partialend = parts[1];
}

var start = parseInt(partialstart, 10);
var end = partialend ? parseInt(partialend, 10) : total - 1;   // fake a large file if no range request

var chunksize = (end - start) + 1;

resp.writeHead(206, {
    // note: Content-Length together with Transfer-Encoding: chunked is contradictory in HTTP
    'Transfer-Encoding': 'chunked',
    'Content-Type': 'video/mp4',
    'Content-Length': chunksize,   // large size to fake a file
    'Accept-Ranges': 'bytes',
    // the range itself belongs in Content-Range, not Accept-Ranges
    'Content-Range': 'bytes ' + start + '-' + end + '/' + total
});

3) The client has to use HTML5 video tags.

I have no problem streaming playback to the HTML5 client (using fs.createReadStream with a 206 HTTP partial-content response) of a video file previously recorded with the above FFMPEG command line (but saved to a file instead of STDOUT), so I know the FFMPEG stream is correct, and I can even see the video streaming live in VLC when connecting to the node HTTP server.

However, trying to stream live from FFMPEG via node HTTP seems to be a lot harder, as the client displays one frame and then stops. I suspect the problem is that I am not setting up the HTTP connection to be compatible with the HTML5 video client. I have tried a variety of things, like using HTTP 206 (partial content) and 200 responses and putting the data into a buffer before streaming, with no luck, so I need to go back to first principles to make sure I'm setting this up the right way.

Here is my understanding of how this should work; please correct me if I'm wrong:

1) FFMPEG should be set up to fragment the output and use an empty moov (the FFMPEG frag_keyframe and empty_moov movflags). This means the client does not rely on the moov atom, which is typically at the end of the file and isn't relevant when streaming (there is no end of file), but it also means no seeking is possible, which is fine for my use case.

2) Even though I use MP4 fragments and an empty MOOV, I still have to use HTTP partial content, as otherwise the HTML5 player will wait until the entire stream is downloaded before playing, and with a live stream that never ends, this is unworkable.

3) I don't understand why piping the STDOUT stream to the HTTP response doesn't work when streaming live, yet if I save to a file I can easily stream that file to HTML5 clients using similar code. Maybe it's a timing issue, as it takes a second for the FFMPEG spawn to start, connect to the IP camera, and send chunks to node, and the node data events are irregular as well. However, the byte stream should be exactly the same as saving to a file, and HTTP should be able to cater for delays.

4) When checking the network log from the HTTP client when streaming an MP4 file created by FFMPEG from the camera, I see three client requests: a general GET request for the video, to which the HTTP server returns about 40 KB; then a partial-content request with a byte range for the last 10 KB of the file; and then a final request for the bits in the middle that weren't loaded. Maybe the HTML5 client, once it receives the first response, is asking for the last part of the file to load the MP4 MOOV atom? If this is the case, it won't work for streaming, as there is no MOOV atom and no end of file.

5) When checking the network log when trying to stream live, I get an aborted initial request with only about 200 bytes received, then a re-request, again aborted after 200 bytes, and a third request which is only 2 KB long. I don't understand why the HTML5 client would abort the request, as the byte stream is exactly the same as the one I can successfully stream from a recorded file. It also seems node isn't sending the rest of the FFMPEG stream to the client, yet I can see the FFMPEG data in the .on event routine, so it is reaching the node HTTP server.

6) Although I think piping the STDOUT stream to the HTTP response should work, do I have to build an intermediate buffer and stream that will allow HTTP partial-content client requests to work properly, as they do when the server (successfully) reads a file? I think this is the main reason for my problems, but I'm not exactly sure how best to set that up in Node. And I don't know how to handle a client request for data at the end of the file, as there is no end of file.

7) Am I on the wrong track in trying to handle 206 partial-content requests, and should this work with normal 200 HTTP responses? HTTP 200 responses work fine for VLC, so I suspect the HTML5 video client will only work with partial-content requests (see the sketch below)?

As I'm still learning this stuff, it's difficult to work through the various layers of this problem (FFMPEG, node, streaming, HTTP, HTML5 video), so any pointers will be greatly appreciated. I have spent hours researching on this site and the net, and I have not come across anyone who has been able to do real-time streaming in node, but I can't be the first, and I think this should be able to work (somehow!).
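
For reference, here is a minimal sketch of the pattern the question is circling around (the port and handler names are my own assumptions; the camera URL and FFMPEG flags are reused from above): serve the live stream with a plain 200 response and no Content-Length, piping stdout straight to the response so node falls back to chunked transfer:

var http = require("http");
var child_process = require("child_process");

http.createServer(function (req, resp) {
    // Plain 200 and no Content-Length: node uses chunked transfer,
    // which suits a stream with no known end.
    resp.writeHead(200, { "Content-Type": "video/mp4" });

    var ffmpeg = child_process.spawn("ffmpeg", [
        "-i", "rtsp://admin:12345@192.168.1.234:554",
        "-vcodec", "copy", "-f", "mp4",
        "-reset_timestamps", "1",
        "-movflags", "frag_keyframe+empty_moov",
        "-"
    ], { detached: false });

    ffmpeg.stdout.pipe(resp);

    req.on("close", function () {
        ffmpeg.kill();   // stop FFMPEG when the client disconnects
    });
}).listen(8080);

Whether the HTML5 client accepts this still depends on the browser's demuxer being happy with fragmented MP4 delivered over a plain 200 response; that is the open question in this post.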


Source: (StackOverflow)

Streaming via RTSP or RTP in HTML5

I'm building a web app that should play back an RTSP/RTP stream from a server (http://lscube.org/projects/feng).

Does the HTML5 video/audio tag support RTSP or RTP? If not, what would the easiest solution be? Perhaps dropping down to a VLC plugin or something like that.
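
For reference, the legacy VLC browser plugin was typically embedded along these lines (a sketch of the old VLC Mozilla plugin, which the viewer must have installed; the stream URL is a placeholder):

<!-- Legacy VLC plugin embed; 'target' takes the stream MRL. -->
<embed type="application/x-vlc-plugin"
       width="640" height="480"
       autoplay="yes"
       target="rtsp://example.com/stream" />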


Source: (StackOverflow)

Play YouTube videos with MPMoviePlayerController instead of UIWebView

Hello, I'm trying to stream some YouTube videos using MPMoviePlayerController, but I'm having some problems. The code I'm using is pretty simple, and I can play .m4v videos by passing a URL to initWithContentURL:. When I launch the movie player, the player comes up but goes away after about 20 seconds. When I try it in the simulator, I get an alert view that says the server is not configured correctly. Is there an argument I need to pass with the URL to get a specific type of video feed from Google?

NSURL *videoURL = [NSURL URLWithString:@"http://www.youtube.com/v/HGd9qAfpZio&hl=en_US&fs=1&"];
MPMoviePlayerController *moviePlayer;
moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:videoURL];

[moviePlayer play];

I've also tried the following URL: http://www.youtube.com/watch?v=HGd9qAfpZio

I have also seen the argument &format=1 and tried adding it to the end of both strings, but no luck.

Thanks in advance for any help!


Source: (StackOverflow)

Live-stream video from one android phone to another over WiFi

I have searched the internet for days now on how to implement a video-streaming feature from one Android phone to another over a WiFi connection, but I can't seem to find anything useful. I looked for sample code on Android Developers, Stack Overflow, Google, and Android blogs, but found nothing. All I can find are phone-to-desktop or desktop-to-phone streaming solutions, nothing I can borrow for my implementation.

I need to control a robot using an Arduino ADK, so I am using two phones: one mounted on the robot and another that receives the video stream from it. I mention this because I am trying to achieve the smallest possible delay between broadcast time and viewing time.

I am writing two apps: a master app to control the robot (from the handheld phone), which will control the slave app and receive the stream, and a slave app that will run on the robot-mounted phone, controlling the motors/actuators and streaming to the master app. Unfortunately, I cannot use third-party apps; I need to integrate the video-stream code into my two apps.

What options are there for achieving this? Also, is it very hard to do? I have never worked with video streaming, though I am doing pretty well in both Java and Android development. How should I encode/decode the stream, how do I initiate the connection, and will I need to work with UDP instead of TCP/IP? I really don't know where to start, with no sample code anywhere. I am pretty sure this can be achieved; I just can't find anything useful to get me started in the right direction.

I stumbled across Spydroid, but it uses VLC on a desktop, so it's no good for me.
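
For a sense of the moving parts, here is a naive sketch (my own, with assumed names: udpSocket is a DatagramSocket and receiverAddress an InetAddress, both set up elsewhere): compress each camera preview frame to JPEG on the sender and fire it over UDP, trading dropped frames for low latency. A real implementation would use a proper H.264 encoder and RTP instead:

camera.setPreviewCallback(new Camera.PreviewCallback() {
    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        try {
            Camera.Size size = camera.getParameters().getPreviewSize();
            // Preview frames arrive as NV21; wrap and compress to JPEG.
            YuvImage yuv = new YuvImage(data, ImageFormat.NV21,
                    size.width, size.height, null);
            ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
            yuv.compressToJpeg(new Rect(0, 0, size.width, size.height), 60, jpeg);

            // Each frame must fit in one datagram (< ~64 KB) for this naive
            // approach; lost packets simply become dropped frames.
            byte[] payload = jpeg.toByteArray();
            udpSocket.send(new DatagramPacket(
                    payload, payload.length, receiverAddress, 5000));
        } catch (IOException e) {
            Log.e("Stream", "frame send failed", e);
        }
    }
});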


Source: (StackOverflow)

What are the difference between MediaPlayer and VideoView for Android

I just wonder what the differences between the two of them are for streaming video.

I know VideoView can be used for streaming, but what is MediaPlayer for? As far as I know, MediaPlayer can do the same thing as VideoView, right?

Can anyone give me an answer?

And if I want to stream video from a server to Android using RTSP, what should I start with: VideoView or MediaPlayer?

Any suggestions?
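
For context, a side-by-side sketch (the RTSP URL is a placeholder, and R.id.video_view and surfaceHolder are assumed to exist in your layout/activity): VideoView wraps MediaPlayer and supplies its own display surface, while MediaPlayer is the underlying engine and needs a SurfaceHolder you manage yourself:

// VideoView: the quick route; it creates and manages a MediaPlayer internally.
VideoView videoView = (VideoView) findViewById(R.id.video_view);
videoView.setVideoURI(Uri.parse("rtsp://example.com/stream"));
videoView.start();

// MediaPlayer: lower level; you provide the surface and drive the state machine.
MediaPlayer player = new MediaPlayer();
try {
    player.setDataSource("rtsp://example.com/stream");
    player.setDisplay(surfaceHolder);   // a SurfaceHolder from your own SurfaceView
    player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            mp.start();
        }
    });
    player.prepareAsync();              // prepare off the UI thread
} catch (IOException e) {
    Log.e("Player", "failed to set data source", e);
}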


Source: (StackOverflow)

Streaming large file uploads to ASP.NET MVC

For an application I'm working on, I need to allow the user to upload very large files (potentially many gigabytes) via our website. Unfortunately, ASP.NET MVC appears to load the entire request into RAM before beginning to service it, which is not exactly ideal for such an application. Notably, trying to circumvent the issue via code such as the following:

// 'clientRequest' is the incoming request; 'request' (an outbound
// HttpWebRequest) and 'fileStream' are set up elsewhere.
if (request.Method == "POST")
{
    request.ContentLength = clientRequest.InputStream.Length;
    var rgbBody = new byte[32768];

    using (var requestStream = request.GetRequestStream())
    {
        int cbRead;
        while ((cbRead = clientRequest.InputStream.Read(rgbBody, 0, rgbBody.Length)) > 0)
        {
            fileStream.Write(rgbBody, 0, cbRead);
        }
    }
}

fails to circumvent the buffer-the-request-into-RAM behavior. Is there an easy way to work around it?
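
For reference, one workaround often suggested for this (a sketch, not a drop-in fix: it requires .NET 4+, bypasses MVC model binding, and 'savePath' is a placeholder) is to read the body through HttpRequest.GetBufferlessInputStream, which streams the request instead of buffering it:

// Stream the upload to disk without buffering it in RAM (ASP.NET, .NET 4+).
Stream input = HttpContext.Current.Request.GetBufferlessInputStream();
byte[] buffer = new byte[32768];

using (FileStream output = File.Create(savePath))
{
    int read;
    while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
    {
        output.Write(buffer, 0, read);
    }
}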


Source: (StackOverflow)

How do streaming resources fit within the RESTful paradigm?

With a RESTful service you can create, read, update, and delete resources. This all works well when you're dealing with something like database assets, but how does it translate to streaming data? (Or does it?) For instance, in the case of video, it seems silly to treat each frame as a resource to query one at a time. Rather, I would set up a socket connection and stream a series of frames. But does this break the RESTful paradigm? What if I want to be able to rewind or fast-forward the stream? Is this possible within the RESTful paradigm? So: how do streaming resources fit within the RESTful paradigm?

As a matter of implementation, I am getting ready to create such a streaming-data service, and I want to make sure I'm doing it the best way. I'm sure this problem has been solved before. Can someone point me to good material?
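
One way to frame the seeking part (my own illustration, not from the post): HTTP already lets a client address a portion of a resource with the Range header, which maps naturally to rewind and fast-forward without breaking the resource model:

GET /videos/42 HTTP/1.1
Host: example.com
Range: bytes=1048576-2097151

HTTP/1.1 206 Partial Content
Content-Range: bytes 1048576-2097151/31457280
Content-Type: video/mp4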


Source: (StackOverflow)

How can I stream webcam video with C#?

I want to make a simple server application where people can connect using a browser-based client (which I will make later) to watch streaming video, and I want to use C#.

What do I need to capture video or rapid images through a webcam and send them over the network?


Source: (StackOverflow)

HLS streaming video URL Need for testing

I am developing a website using HTML5 and need sample HLS streaming video URLs for testing purposes. Could someone provide sample HLS streaming URLs? My targets are IE9, iPad, Chrome, and Android devices.

I am completely new to media, and I would also like to know whether we can create streaming video in whatever format we need, such as .ogg, .mp4, or .WebM.


Source: (StackOverflow)

Android - MediaPlayer Buffer Size in ICS 4.0

I'm using a socket as a proxy to the MediaPlayer so I can download and decrypt MP3 audio before writing it to the socket. This is similar to the example shown in the NPR news app; however, I'm currently using this for all Android versions 2.1 - 4.

NPR StreamProxy code - http://code.google.com/p/npr-android-app/source/browse/Npr/src/org/npr/android/news/StreamProxy.java

My issue is that playback starts quickly on 2.1 - 2.3, but on Android 4.0 (ICS) the MediaPlayer buffers too much data before firing the onPrepared listener.

Example amounts of data written to the socket OutputStream before onPrepared():

On SGS2 with 2.3.4 - onPrepared() after ~ 133920 bytes

On Nexus S with 4.0.4 - onPrepared() after ~ 961930 bytes

This also occurs on the Galaxy Nexus.

Weirdly, the 4.0 emulator doesn't buffer as much data as 4.0 devices do. Has anyone experienced a similar issue with the MediaPlayer on ICS?

EDIT

Here's how the proxy writes to the socket. In this example the data comes from a CipherInputStream loaded from a file, but the same occurs when it's loaded from the HttpResponse.

final Socket client = /* set up above */;

// encrypted file input stream
final CipherInputStream inputStream = getInputStream(file);

// setup the socket output stream
final OutputStream output =  client.getOutputStream();

// Writing the header
final String httpHeader = buildHttpHeader(file.length());
final byte[] buffer = httpHeader.getBytes("UTF-8");
output.write(buffer, 0, buffer.length);

int writtenBytes = 0;
int readBytes;
final byte[] buff = new byte[1024 * 12]; // 12 KB

while (mIsRunning && (readBytes = inputStream.read(buff)) != -1) {
    output.write(buff, 0, readBytes);
    writtenBytes += readBytes;
}

output.flush();
output.close();

The HTTP headers that are written to the MediaPlayer before the audio:

private String buildHttpHeader(final int contentLength) {
    final StringBuilder sb = new StringBuilder();

    sb.append("HTTP/1.1 200 OK\r\n");
    sb.append("Content-Length: ").append(contentLength).append("\r\n");
    sb.append("Accept-Ranges: bytes\r\n" );
    sb.append("Content-Type: audio/mpeg\r\n");
    sb.append("Connection: close\r\n" );
    sb.append("\r\n");

    return sb.toString();
}

I've looked around for alternate implementations, but as I have encrypted audio, and the MediaPlayer does not support InputStreams as a data source, my only option (I think) is to use a proxy such as this.

Again, this works fairly well on Android 2.1 - 2.3, but in ICS the MediaPlayer buffers a huge amount of this data before playing.

EDIT 2:

Further testing shows that this is also an issue on the SGS2 once upgraded to Android 4.0.3, so it seems the MediaPlayer's buffering implementation changed significantly in 4.0. This is frustrating, as the API provides no way to alter the behaviour.

EDIT 3:

I've created an Android bug for this. Please add comments and star it as well: http://code.google.com/p/android/issues/detail?id=29870

EDIT 4:

My playback code is fairly standard: I call start() on the MediaPlayer in my onPrepared() method.

mCurrentPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
mCurrentPlayer.setDataSource(url);
mCurrentPlayer.prepareAsync();

I have tried using just prepare(), and also ajacian81's recommended way, but to no avail.

I should add that a Google employee recently got back to me about my question and confirmed that the buffer size was intentionally increased in ICS (for HD content). The API developers have been asked to add the ability to set the buffer size on MediaPlayer.

Though I think this API change request had been around before I came along, so I wouldn't advise anyone to hold their breath.


Source: (StackOverflow)