EzDevInfo.com

video-streaming interview questions

Top video-streaming frequently asked interview questions

How can I display an RTSP video stream in a web page?

I have an IP camera which provides a live RTSP video stream. I can use VLC media player to view the feed by providing it with the URL:

rtsp://cameraipaddress

But I need to display the feed on a web page. The camera provider supplied an ActiveX control which I got working, but it is really buggy and causes the browser to frequently hang.

Does anyone know of any alternative video plugins I could use which support RTSP?

The camera can be configured to stream in either H.264 or MPEG-4.
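One plugin-free route I am considering, assuming I can run a small relay box next to the camera, is to repackage the RTSP feed into HLS with ffmpeg and play it with hls.js; a rough sketch (the paths, playlist location, and use of hls.js are my own choices, not anything the camera vendor provides):

    # Pull the camera's H.264 stream and repackage it (no re-encode)
    # into HLS segments that any plain web server can serve.
    ffmpeg -rtsp_transport tcp -i rtsp://cameraipaddress \
           -c copy -f hls -hls_time 2 -hls_list_size 5 \
           -hls_flags delete_segments /var/www/stream/index.m3u8

    <!-- Browser side: hls.js feeds the playlist to a plain video tag. -->
    <video id="cam" autoplay muted></video>
    <script src="hls.js"></script>
    <script>
      var video = document.getElementById("cam");
      var hls = new Hls();
      hls.loadSource("/stream/index.m3u8"); // playlist written by ffmpeg
      hls.attachMedia(video);
    </script>

Segmenting adds a few seconds of latency, though, so I'd still be interested in lower-latency options.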


Source: (StackOverflow)

HTTP Live Streaming

OK, I have been trying to wrap my head around HTTP Live Streaming. I just do not understand it, and yes, I have read all the Apple docs and watched the WWDC videos, but I am still super confused, so please help a wannabe programmer out!

Does the code you write go on the server, not in Xcode? If so, how do I set this up? Do I need to set up something special on my server, like PHP? And how do I use the tools supplied by Apple, such as the segmenter?
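If I'm reading the docs right, there is no server-side code at all: the segmenter writes static .ts chunks plus a plain-text .m3u8 playlist, roughly like the one below, and any web server (no PHP needed) just serves those files. Is that right?

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    fileSequence0.ts
    #EXTINF:10.0,
    fileSequence1.ts
    #EXTINF:10.0,
    fileSequence2.ts
    #EXT-X-ENDLIST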

Please help me, Thanks


Source: (StackOverflow)

Live-stream video from one android phone to another over WiFi

I have searched the internet for days now on how to implement video streaming from one Android phone to another over a WiFi connection, but I can't seem to find anything useful. I looked for sample code on Android Developers, Stack Overflow, Google, and Android blogs, but found nothing. All I can find are phone-to-desktop or desktop-to-phone streaming solutions, nothing I can borrow for my implementation.

I need to control a robot using an Arduino ADK, so I am using two phones: one mounted on the robot and another which will receive the video stream from it. I mention this because I am trying to achieve the smallest possible delay between broadcast time and viewing time.

I am writing two apps: a master app which controls the robot from the handheld phone, commands the slave app, and receives the stream; and a slave app which runs on the robot-mounted phone, controlling the motors/actuators and streaming to the master. Unfortunately I cannot use third-party apps; I need to integrate the video-stream code into my two apps.

What options are there for achieving this? Also, is it very hard to do? I have never worked with video streaming, though I am doing pretty well in both Java and Android development. How should I encode/decode the stream, how do I initiate the connection, and will I need to use UDP instead of TCP? I really don't know where to start, with no sample code anywhere, but I am pretty sure this can be achieved. I just can't find anything useful to get me started in the right direction.

I stumbled across spydroid, but it uses VLC on a desktop, so it's no good for me.
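To make the question concrete, the kind of thing I imagine on the sender side is grabbing camera preview frames, JPEG-compressing them, and firing them over UDP; a rough sketch (the receiver address and port are made up, and a real codec like H.264 would beat MJPEG easily):

    // Rough sender sketch: compress each preview frame to JPEG and send
    // it as one UDP datagram. Lossy and bandwidth-hungry, but low-latency.
    // Caveats: frames over ~64 KB need chunking, and the send belongs on
    // a background thread in a real app. udpSocket is a DatagramSocket field.
    camera.setPreviewCallback(new Camera.PreviewCallback() {
        @Override
        public void onPreviewFrame(byte[] nv21, Camera cam) {
            try {
                Camera.Size size = cam.getParameters().getPreviewSize();
                YuvImage yuv = new YuvImage(nv21, ImageFormat.NV21,
                        size.width, size.height, null);
                ByteArrayOutputStream jpeg = new ByteArrayOutputStream();
                yuv.compressToJpeg(new Rect(0, 0, size.width, size.height),
                        60, jpeg); // quality 60 keeps frames small
                byte[] bytes = jpeg.toByteArray();
                udpSocket.send(new DatagramPacket(bytes, bytes.length,
                        InetAddress.getByName("192.168.1.50"), 5000)); // made up
            } catch (IOException e) {
                Log.e("Stream", "frame send failed", e);
            }
        }
    });

Would something along those lines work, or is a proper encoder unavoidable?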


Source: (StackOverflow)

Streaming video from Android camera to server

I've seen plenty of info about how to stream video from a server to an Android device, but not much about the other way, à la Qik. Could someone point me in the right direction here, or give me some advice on how to approach this?
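One trick I've seen mentioned, reportedly how spydroid-style apps do it, is to point a MediaRecorder at a socket instead of a file; a rough sketch, with the server name and port made up (this would live in a method that can throw IOException):

    // Sketch: pipe MediaRecorder output straight into a TCP socket.
    // Caveat: for MP4/3GP the moov header is only written on stop(), so
    // the raw stream needs fixing up server-side before it is playable.
    Socket socket = new Socket("media.example.com", 8000); // made up
    ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

    MediaRecorder recorder = new MediaRecorder();
    recorder.setCamera(camera); // an already-opened, unlocked Camera
    recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
    recorder.setOutputFormat(MediaRecorder.OutputFormat.THREE_GPP);
    recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
    recorder.setOutputFile(pfd.getFileDescriptor());
    recorder.prepare();
    recorder.start(); // encoded frames now flow to the server

Is that a sane starting point, or is there a better-supported path?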


Source: (StackOverflow)

Video streaming over websockets using JavaScript

What is the fastest way to stream live video using JavaScript? Is WebSocket over TCP a fast enough protocol to stream video at, say, 30 fps?
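For scale, the naive approach I have in mind is shipping JPEG frames over the socket; a rough sketch (the server URL is made up, and it assumes a <video> element already playing a getUserMedia stream):

    // Sender sketch: push one JPEG frame per tick over a WebSocket.
    var ws = new WebSocket("ws://example-host:9000"); // made up
    var video = document.getElementById("cam");
    var canvas = document.createElement("canvas");
    canvas.width = 640;
    canvas.height = 480;
    var ctx = canvas.getContext("2d");

    setInterval(function () {
      if (ws.readyState !== WebSocket.OPEN) return;
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      // toBlob(callback, mimeType, quality); WebSocket.send accepts Blobs
      canvas.toBlob(function (blob) { ws.send(blob); }, "image/jpeg", 0.7);
    }, 1000 / 30); // 30fps target; the real rate depends on bandwidth

Is raw JPEG-over-WebSocket workable at that rate, or does it fall over in practice?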


Source: (StackOverflow)

How to use VLC live streams with HTML5 video?

I tried HTTP Ogg/Theora and it works all right with Chrome, but not with Firefox 7.

VLC Configuration:

For testing, I've been streaming the desktop using the following vlc command line configuration:

vlc.exe screen:// :screen-fps=30 :screen-caching=100 :sout=#transcode{vcodec=theo,vb=800,scale=1,width=800,height=600,acodec=none}:http{mux=ogg,dst=:8181/desktop} :no-sout-rtp-sap :no-sout-standard-sap :ttl=1 :sout-keep

HTML5 video tag configuration:

<video id="video" src="http://my_host_name:8181/desktop" type="video/ogg; codecs=theora" autoplay="autoplay"></video>
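One thing I haven't ruled out: Firefox may be stricter about where the type hint goes, since it belongs on a <source> child rather than on the <video> element itself:

    <!-- Same stream, with the codec hint moved onto a <source> element -->
    <video id="video" autoplay>
      <source src="http://my_host_name:8181/desktop"
              type='video/ogg; codecs="theora"'>
    </video>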

Any ideas?


Source: (StackOverflow)

Best way to stream files in ASP.NET

What's the best way to stream files using ASP.NET?

There appear to be various methods for this, and I'm currently using the Response.TransmitFile() method inside an HTTP handler, which sends the file to the browser directly. This is used for various things, including sending FLVs from outside the webroot to an embedded Flash video player.

However, this doesn't seem like a reliable method. In particular, there's a strange problem with Internet Explorer 7, where the browser just hangs after a video or two are viewed. Clicking on links has no effect, and the only way to get the site working again is to close the browser and re-open it.

This also occurs in other browsers, but much less frequently. Based on some basic testing, I suspect this is something to do with the way files are being streamed... perhaps the connection isn't being closed properly, or something along those lines.

After trying a few different things, I've found that the following method works for me:

Response.WriteFile(path);
Response.Flush();
Response.Close();
Response.End();

This gets around the problem mentioned above, and viewing videos no longer causes Internet Explorer to hang.

However, my understanding is that Response.WriteFile() loads the file into memory first, and given that some files being streamed could potentially be quite large, this doesn't seem like an ideal solution.

I'm interested in hearing how other developers are streaming large files in ASP.NET, and in particular, streaming FLV video files.
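For reference, the chunked alternative I'm weighing looks roughly like this (a sketch, assuming it runs inside an IHttpHandler's ProcessRequest(context), with path pointing at the file to send):

    // Stream the file in fixed-size chunks so it is never fully in memory.
    const int BufferSize = 64 * 1024;
    context.Response.Buffer = false; // don't buffer the whole response
    context.Response.ContentType = "video/x-flv";
    using (var fs = new FileStream(path, FileMode.Open,
                                   FileAccess.Read, FileShare.Read))
    {
        var buffer = new byte[BufferSize];
        int read;
        while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            if (!context.Response.IsClientConnected)
                break; // stop early if the browser went away
            context.Response.OutputStream.Write(buffer, 0, read);
            context.Response.Flush();
        }
    }

Does checking IsClientConnected like this actually avoid the hung-connection problem, or is something else going on?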


Source: (StackOverflow)

TCP vs UDP on video stream

I just came home from my exam in network programming, and one of the questions they asked was: "If you are going to stream video, would you use TCP or UDP? Give an explanation for both stored video and live video streams." They simply expected a short answer of TCP for stored video and UDP for live video, but I thought about this on my way home: is it necessarily better to use UDP for streaming live video? I mean, if you have the bandwidth for it, and say you are streaming a soccer match, or a concert for that matter, do you really need to use UDP?

Let's say that while you are streaming this concert over TCP, you start losing packets (something bad happened in some network between you and the sender), and for a whole minute you don't get any packets. The video stream will pause, and after the minute is gone packets start to get through again (IP found a new route for you). TCP would then retransmit the minute you lost and continue sending you the live stream. Assuming the bandwidth is higher than the stream's bit rate and the ping is not too high, the minute you lost will, in a short amount of time, come to act as a buffer for the stream, so that if packet loss happens again, you won't notice.

Now, I can think of some applications where this wouldn't be a good idea, like video conferencing, where you always need to be at the live edge of the stream, because delay during a video chat is just horrible. But during a soccer match or a concert, what does it matter if you are a single minute behind the stream? Plus, you are guaranteed to get all the data, which is better if you want to save it for later viewing without any errors.

So this brings me to my question. Are there any drawbacks that I don't know of to using TCP for live streaming? Or should it really be that if you have the bandwidth for it, you should go for TCP, given that it is "nicer" to the network (flow control)?


Source: (StackOverflow)

Video/audio streaming does not stop even if UIWebView is closed - iPad

I see this issue only on the iPad. The same thing works as expected on the iPhone.

I am opening the URL from my application in a UIWebView. If the URL is a normal web page, it works as expected. But if the URL is that of a remote video/audio file, the UIWebView opens the default player, which is again good.

Now when I dismiss the UIWebView (by clicking the Done button on the player), the streaming doesn't stop, and the audio/video keeps playing in the background (I cannot see it, but I can hear it). The UIViewController in which the web view was created is also deallocated (I put a log statement in the dealloc method), but the streaming doesn't stop.

Can someone please help me out on why this could be happening? And how can I stop the audio/video streaming when the UIWebView is closed?
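One workaround I've seen suggested is to blank the web view when its controller disappears, so the embedded media player tears down; roughly, in Swift (assuming this controller owns webView):

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        webView.stopLoading()
        webView.loadHTMLString("", baseURL: nil) // blank page stops playback
        webView.delegate = nil
    }

Is there a cleaner way than loading a blank page?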

Thanks.


Source: (StackOverflow)

Streaming live camera video from iOS (iPhone/iPad) to remote PC / server

I've been searching for a while on Stack Overflow and around the web for a solution to my video-streaming problem. I need to stream live video captured from the camera (no high quality required) from an iOS device to a remote PC one way, i.e., the iOS device will send a video stream to the server/PC but not the opposite.

What appears after some googling and documentation browsing is that there are two main standards/protocols that can be used:

  • Apple's HTTP Live Streaming (HLS)
  • Adobe's RTMP

Again, my requirement is that the iPhone/iPad will be streaming the video. From what appears on Apple's website, I understand that HLS encoding is meant to happen server-side, with iOS doing only the decoding. As for RTMP, most libraries that allow iOS streaming have commercial licenses and closed code, or require you to go through their P2P infrastructure (for instance this or this). As for HLS, no encoding libraries seem to exist on the iOS side.

So my questions are:

  • Do you know of any SDK/library, preferably open and free, that I could integrate to stream captured video from within my app?
  • If not, do you think developing a custom library would be a risky jungle-crossing endeavour? My guess is to go through AVFoundation, capture camera frames, compress them frame by frame, and send them over HTTP (see the capture sketch after this list). Does that sound crazy performance- and bandwidth-wise? Note that in that case I would need an HLS or RTMP encoder either way.
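For the AVFoundation route, the capture half I have in mind looks roughly like this Swift sketch (compression and upload of each frame are left out, and all the names here are my own):

    import AVFoundation

    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()

        func start() throws {
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            session.addInput(try AVCaptureDeviceInput(device: camera))
            let output = AVCaptureVideoDataOutput()
            output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
            session.addOutput(output)
            session.startRunning()
        }

        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            // One raw frame per call; compress it (e.g. to JPEG) and
            // POST it to the server here.
        }
    }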

I thank you very much in advance, dear friends.

Mehdi


Source: (StackOverflow)

VideoView onResume loses buffered portion of the video

I have an Activity which contains:

  1. VideoView -- Streams a video from a webserver.

  2. Button -- Takes the user to the next activity to be shown.

When the application starts, the VideoView is made to play the video from a webserver.

Now assume

 Total Video length is 60 Minutes

 Current Video progress is 20 Minutes

 Current Buffered progress 30 Minutes 

Now I click on the above-mentioned Button, which takes the user to the next Activity.

From that Activity, if I press the back button, the previous Activity (with the VideoView and Button) reappears in front of the user. But on resume, all the buffered portion of the video is lost, so the VideoView starts playing the video from the beginning, which is really bad. <-- Actual problem

Problem

When the Activity is resumed, the buffered portion of the video is lost and buffering starts again. How can I avoid re-buffering the already buffered portion of the video?

Even the official YouTube Android app has the same problem.

Edit 1 :

I tried the code below in the Activity, but it's not working.

@Override
protected void onPause() {
    super.onPause();
    videoView.suspend(); // releases the underlying MediaPlayer
}

@Override
protected void onResume() {
    super.onResume();
    videoView.resume(); // re-creates the player, so buffering restarts
}

Can anyone guide me regarding this problem? Or am I missing something to make this work perfectly?

Current Workaround

I save the current playing position of the video in the onPause() method, and in onResume() I seek the video to that position. This works fine, but buffering starts from the beginning even though the video starts playing from the seek position.
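In code, the workaround looks roughly like this:

    // Save-position workaround: the buffer itself is still lost, and the
    // player re-buffers from the seek point onwards.
    private int savedPosition = 0;

    @Override
    protected void onPause() {
        super.onPause();
        savedPosition = videoView.getCurrentPosition(); // in milliseconds
        videoView.pause();
    }

    @Override
    protected void onResume() {
        super.onResume();
        videoView.seekTo(savedPosition);
        videoView.start();
    }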

Any help is deeply appreciated.


Source: (StackOverflow)

Video Streaming and Android

Today, for one of my apps (Android 2.1), I wanted to stream a video from a URL.

As far as I have explored the Android SDK, it's quite good, and I have loved almost every piece of it. But now that it comes to video streaming, I am kind of lost.

For anything else you need to know about the Android SDK, there are thousands of blogs telling you how to do it. When it comes to video streaming, it's different: information is not that abundant.

Everyone does it their own way, with a trick here and there.

Is there any well-known procedure that allows one to stream a video?

Did Google think of making it easier for its developers?
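For HTTP progressive or RTSP URLs, the closest thing to a standard recipe I've found is just pointing a VideoView at the URL, roughly like this (inside onCreate, after setContentView; the layout id and URL are examples):

    // Minimal sketch: VideoView handles buffering and playback itself.
    final VideoView videoView = (VideoView) findViewById(R.id.video);
    videoView.setVideoURI(Uri.parse("http://example.com/clip.mp4"));
    videoView.setMediaController(new MediaController(this));
    videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            videoView.start(); // begins once enough is buffered
        }
    });

Is that really all there is, or are there pitfalls with particular formats?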


Source: (StackOverflow)

Access to the iOS' video decoder?

The iPad/iOS has video streaming support for e.g. H.264 using MPMoviePlayerController etc., but I receive H.264 data through a custom, proprietary stream and need to decode it in a soft real-time scenario.
Can the iPad's/iOS's video decoder be accessed in any way to decode this data?

Update: Apparently the iOS 4.0 Core Media framework supports decoding frames and knows about H.264, but there is no sample code, nor can I see what I am actually supposed to call for the decoding itself.


Source: (StackOverflow)

How do you configure S3 and CloudFront to stream HTML5 video? Tried everything

I've tried many, many different configurations, files, encoding, browsers, etc..., but this is the simplest example that demonstrates the problem I am having.

If you paste the URL for the sample video for JSPlayer into FF 8.0.1, the video plays inline:

http://video-js.zencoder.com/oceans-clip.webm

If I take that same video and upload it to my S3 bucket, it triggers a download instead:

https://s3.amazonaws.com/turingvideos/oceans-clip.webm -- or -- http

(Permissions are read for everyone on the file and bucket)

So, let's try CloudFront.

http://d2yat6m71lu23b.cloudfront.net/oceans-clip.webm (download trigger)

And CloudFront streaming:

http://strzsu4h2ax96.cloudfront.net/oceans-clip.webm (infinite spinner)

The same basic things happen when using an HTML video tag as well. It works fine from Zencoder, but is borked on anything other than a local disk read.

So, what magic is Zencoder managing that is completely out of my reach with S3/CloudFront? I'm completely stumped.

Edit:

Setting the content type to "video/webm" and the disposition to "inline" did the trick. Thanks for the quick responses, guys.
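For anyone hitting the same wall, the fix in code looks roughly like this sketch with the AWS SDK for Java (the bucket and key names are from the example above):

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.ObjectMetadata;
    import com.amazonaws.services.s3.model.PutObjectRequest;
    import java.io.File;

    public class FixContentType {
        public static void main(String[] args) {
            // Upload with explicit Content-Type and Content-Disposition
            // so browsers play the file inline instead of downloading it.
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
            ObjectMetadata metadata = new ObjectMetadata();
            metadata.setContentType("video/webm");
            metadata.setContentDisposition("inline");
            s3.putObject(new PutObjectRequest("turingvideos", "oceans-clip.webm",
                    new File("oceans-clip.webm")).withMetadata(metadata));
        }
    }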


Source: (StackOverflow)

Creating an MJPEG video stream in c#

I have images being sent to my database from a remote video source at about 5 frames per second as JPEG images. I am trying to figure out how to get those images into a video format so I can stream a live video feed to Silverlight.

It seems to make sense to create an MJPEG stream, but I'm having a few problems. Firstly, I was trying to stream via an HTTP request so I didn't have to deal with sockets, but maybe this is breaking my code.

If I try to surf to my stream from QT, I get a video error; Media Player shows the first frame image, and Silverlight crashes :)

Here is the code that streams. Since the content type used this way can only be sent once, I know this isn't ideal and might be the root cause. All images are coming in via a LINQ2SQL object.

I already tried simply updating the image source of an Image control in Silverlight, but the flicker isn't acceptable. If Silverlight doesn't support MJPEG there's no point even continuing, but it looks like it does. I do have access to the incoming H.264 frames, but that seemed more complicated via MP4.

    Response.Clear();
    // Note: the boundary parameter must not include the leading dashes;
    // the dashes get added on each delimiter line in the body instead.
    Response.ContentType = "multipart/x-mixed-replace; boundary=myboundary";
    ASCIIEncoding ae = new ASCIIEncoding();
    HCData data = new HCData();
    var videos = (from v in data.Videos
                  select v).Take(50); // sample the first 50 frames
    foreach (Video frame in videos)
    {
        byte[] image = frame.VideoData.ToArray();
        byte[] header = ae.GetBytes(
            "\r\n--myboundary\r\nContent-Type: image/jpeg\r\nContent-Length: "
            + image.Length + "\r\n\r\n");
        Response.OutputStream.Write(header, 0, header.Length); // part header
        Response.OutputStream.Write(image, 0, image.Length);   // JPEG payload
        Response.Flush();
        Thread.Sleep(200); // ~5 fps, matching the source frame rate
    }

Thanks!

EDIT: I have the stream working in Firefox, so if I surf to the page I see video! But nothing else accepts the format: not IE, Silverlight, or Media Player.


Source: (StackOverflow)