EzDevInfo.com

camera interview questions

Top frequently asked camera interview questions

iPhone: Camera Preview Overlay

How do I add an overlay (UIImageView) to the camera preview and handle touches on this?

My previous attempts to do this (e.g. using UIImagePickerController and adding the UIImageView as a subview) have failed.


Source: (StackOverflow)

Capture Image from Camera and Display in Activity

I want to write a module where, on a button click, the camera opens and I can capture an image. If I don't like the image, I can delete it and capture another; once I select an image, it should return and be displayed in the activity.
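
A minimal sketch of the usual intent-based approach (the class name, request code, and R.id.preview are illustrative): fire ACTION_IMAGE_CAPTURE and read back the small thumbnail the camera app returns under the "data" extra. The stock camera activity already offers a retake/confirm step before returning, which covers the delete-and-retake requirement.

import android.app.Activity;
import android.content.Intent;
import android.graphics.Bitmap;
import android.os.Bundle;
import android.provider.MediaStore;
import android.widget.ImageView;

public class CaptureActivity extends Activity {
    private static final int REQUEST_IMAGE_CAPTURE = 1; // arbitrary request code

    private void dispatchTakePictureIntent() {
        Intent takePicture = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        startActivityForResult(takePicture, REQUEST_IMAGE_CAPTURE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQUEST_IMAGE_CAPTURE && resultCode == RESULT_OK) {
            // Without EXTRA_OUTPUT the camera returns a small thumbnail in the "data" extra.
            Bitmap thumbnail = (Bitmap) data.getExtras().get("data");
            ((ImageView) findViewById(R.id.preview)).setImageBitmap(thumbnail);
        }
    }
}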


Source: (StackOverflow)

Using the camera activity in Android

If you want to use the built-in camera activity, which uses the native Android camera, simply do the following:

Intent camera = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
this.startActivityForResult(camera, PICTURE_RESULT);

You want to get the images back from the nifty camera you displayed -- but how?
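
One hedged sketch of the round trip (the field and file name are illustrative; assumes android.net.Uri, android.graphics.Bitmap, android.graphics.BitmapFactory, android.os.Environment, and java.io.File are imported, and PICTURE_RESULT is defined as above): pass a target Uri in MediaStore.EXTRA_OUTPUT so the camera app writes the full-size image to a file you chose, then decode that file in onActivityResult(). Without EXTRA_OUTPUT, only a small thumbnail comes back in the "data" extra.

private File photoFile; // keep the target path so onActivityResult can find it

private void launchCamera() {
    photoFile = new File(Environment.getExternalStorageDirectory(), "camera_result.jpg");
    Intent camera = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
    // Ask the camera app to write the full-size image to our file.
    camera.putExtra(MediaStore.EXTRA_OUTPUT, Uri.fromFile(photoFile));
    startActivityForResult(camera, PICTURE_RESULT);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == PICTURE_RESULT && resultCode == RESULT_OK) {
        Bitmap fullSize = BitmapFactory.decodeFile(photoFile.getAbsolutePath());
        // display or process the bitmap here
    }
}

Note that writing to external storage required the WRITE_EXTERNAL_STORAGE permission on devices of this era.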


Source: (StackOverflow)

"Method called after release()" exception: unable to resume with Android camera

While developing a camera app I've encountered an exception that only happens when I switch to another app (i.e. when onPause() is called for my app).

01-15 17:22:15.017: E/AndroidRuntime(14336): FATAL EXCEPTION: main
01-15 17:22:15.017: E/AndroidRuntime(14336): java.lang.RuntimeException: Method called after release()
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.hardware.Camera.setPreviewDisplay(Native Method)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.hardware.Camera.setPreviewDisplay(Camera.java:357)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at com.sora.cbir.yuki.image.leaf.CameraPreview.surfaceCreated(CameraPreview.java:32)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.view.SurfaceView.updateWindow(SurfaceView.java:551)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.view.SurfaceView.onWindowVisibilityChanged(SurfaceView.java:213)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.view.View.dispatchWindowVisibilityChanged(View.java:4075)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.view.ViewGroup.dispatchWindowVisibilityChanged(ViewGroup.java:742)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.view.ViewGroup.dispatchWindowVisibilityChanged(ViewGroup.java:742)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.view.ViewGroup.dispatchWindowVisibilityChanged(ViewGroup.java:742)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.view.ViewGroup.dispatchWindowVisibilityChanged(ViewGroup.java:742)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.view.ViewRoot.performTraversals(ViewRoot.java:858)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.view.ViewRoot.handleMessage(ViewRoot.java:1995)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.os.Handler.dispatchMessage(Handler.java:99)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.os.Looper.loop(Looper.java:150)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at android.app.ActivityThread.main(ActivityThread.java:4389)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at java.lang.reflect.Method.invokeNative(Native Method)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at java.lang.reflect.Method.invoke(Method.java:507)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:849)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:607)
01-15 17:22:15.017: E/AndroidRuntime(14336):    at dalvik.system.NativeStart.main(Native Method)

I did some research and found out that I need to add

mCamera.setPreviewCallback(null);

as a workaround for Android's camera stack.

My onPause() now looks like this:

@Override
protected void onPause() {
    super.onPause();
    try
    {    
        // release the camera immediately on pause event   
        //releaseCamera();
         mCamera.stopPreview(); 
         mCamera.setPreviewCallback(null);
         mCamera.release();
         mCamera = null;

    }
    catch(Exception e)
    {
        e.printStackTrace();
    }
}

and my onResume():

@Override
protected void onResume()
{
    super.onResume();
    try
    {
        mCamera.setPreviewCallback(null);
        mCamera = getCameraInstance();
        //mCamera.setPreviewCallback(null);
        mPreview = new CameraPreview(Imageupload.this, mCamera);//set preview
        preview.addView(mPreview);
    } catch (Exception e){
        Log.d(TAG, "Error starting camera preview: " + e.getMessage());
    }
}

and finally my getCameraInstance() method:

public Camera getCameraInstance(){
    Camera camera = null;
    try {
        camera = Camera.open(); // attempt to get a Camera instance
    }
    catch (Exception e){
        // Camera is not available (in use or does not exist)
    }
    Camera.Parameters parameters = camera.getParameters();
    //mPreviewSize = getBestPreviewSize(parameters, wt, ht);
    //mPictureSize = getBestPictureSize(parameters, wt, ht);
    //Shift W & H => if camera rotates 90 deg

    mPreviewSize = getOptimalPreviewSize(parameters, wt, ht); //original => wt,ht
    mPictureSize = getOptimalPictureSize(parameters, wt, ht); //original => wt,ht

    Log.d("CAMERA", "SCREEN RESOLUTION H: "+ht);
    Log.d("CAMERA", "SCREEN RESOLUTION W: "+wt);

    Log.d("CAMERA", "PREVIEW RESOLUTION H: "+mPreviewSize.height);
    Log.d("CAMERA", "PREVIEW RESOLUTION W: "+mPreviewSize.width);

    Log.d("CAMERA", "PICTURE RESOLUTION H: "+mPictureSize.height);
    Log.d("CAMERA", "PICTURE RESOLUTION W: "+mPictureSize.width);
    //set preview size based on device screen
    parameters.setPreviewSize(mPreviewSize.width, mPreviewSize.height);
    //set picture size based on device screen
    parameters.setPictureSize(mPictureSize.width, mPictureSize.height);
    //set output camera mode
    parameters.setPictureFormat(PixelFormat.JPEG);
    //set focus mode
    parameters.setFocusMode(FOCUS_MODE_AUTO);
    //set flash mode
    parameters.setFlashMode("auto");
    List<int[]> fps = parameters.getSupportedPreviewFpsRange();
    //System.out.println("FPS size: " +fps.size());
    //System.out.println("MAX FPS:"+(fps.get(fps.size()-1)[1])/1000);
    //log min and max camera supported fps
    Log.d("CAMERA", "CAMERA MAX FPS: "+(fps.get(fps.size()-1)[1])/1000);
    Log.d("CAMERA", "CAMERA MIN FPS: "+(fps.get(fps.size()-1)[0])/1000);
    if(camera_fps)
    {
        parameters.setPreviewFpsRange(fps.get(fps.size()-1)[1], fps.get(fps.size()-1)[1]);
    }
    //set camera parameters
    camera.setParameters(parameters);

    Toast.makeText(getApplicationContext(), "Your device is capable of previewing @" + fps.get(fps.size()-1)[1]/1000 + "fps!", Toast.LENGTH_SHORT).show();
    return camera; // returns null if camera is unavailable
}

Any ideas on how to fix this?


Source: (StackOverflow)

Where can I get the Android camera application source code?

I'm referring to the camera application that is already installed on my G1, not the camera API source code.

How do I get the source code?


Source: (StackOverflow)

Access the camera with iPhone SDK

It seems obvious that some people have been able to figure out how to access the iPhone camera through the SDK (Spore Origins, for example), but I haven't been able to find any helpful information. I don't want anyone to violate their NDA, but does anyone know of any existing (official) resources that show how this can be done? Thanks.


Source: (StackOverflow)

How to save picture to iPhone photo library?

What do I need to do to save an image my program has generated (possibly from the camera, possibly not) to the system photo library on the iPhone?


Source: (StackOverflow)

HTML5 Camera Access Through Browser in iOS

We are creating an HTML5 website for mobile and need to get camera access through the web browser without being a native app. We are having trouble making this work in iOS. Is anyone aware of a solution for this?


Source: (StackOverflow)

Android save view to jpg or png

I would like to write an Android app that layers one image as an overlay on another image, and then saves the picture with the overlay as a JPG or PNG. Basically, the whole view is what I would like to save.

Sample code would be very helpful.

EDIT:

I tried out your suggestions and am getting a NullPointerException at the starred line.

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;

import android.app.Activity;
import android.graphics.Bitmap;
import android.graphics.Bitmap.CompressFormat;
import android.os.Bundle;
import android.os.Environment;
import android.widget.LinearLayout;
import android.widget.TextView;

public class EditPhoto extends Activity {
    /** Called when the activity is first created. */
    LinearLayout ll = null;
    TextView tv = null;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        tv = (TextView) findViewById(R.id.text);
        ll = (LinearLayout) findViewById(R.id.layout);
        ll.setDrawingCacheEnabled(true);
        Bitmap b = ll.getDrawingCache();
        File sdCard = Environment.getExternalStorageDirectory();
        File file = new File(sdCard, "image.jpg");
        FileOutputStream fos;
        try {
            fos = new FileOutputStream(file);
            *** b.compress(CompressFormat.JPEG, 95, fos);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
    }
}
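
For reference, a hedged sketch of a Canvas-based alternative that sidesteps the drawing cache entirely (the helper name is made up). One common cause of the null pointer above is that in onCreate() the layout pass hasn't happened yet, so the view has zero size and getDrawingCache() returns null; deferring the capture until after layout (e.g. via ll.post(...)) is a typical workaround.

// Sketch only: requires the view to have been measured and laid out already
// (non-zero width/height), so don't call it directly from onCreate().
public static Bitmap renderViewToBitmap(View view) {
    Bitmap bitmap = Bitmap.createBitmap(view.getWidth(), view.getHeight(),
            Bitmap.Config.ARGB_8888);
    Canvas canvas = new Canvas(bitmap);
    view.draw(canvas); // draws the whole view hierarchy, overlay included
    return bitmap;
}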

Source: (StackOverflow)

Is there a good tutorial for implementing an augmented reality iPhone application? [closed]

Are there any good tutorials or sample applications out there that demonstrate how to make an augmented reality iPhone application?


Source: (StackOverflow)

QR code (2D barcode) coding and decoding algorithms? [closed]

Looking for free/open-source code, or descriptions of the algorithms, to encode (simple) and decode (hard) the QR code 2D barcode.

It doesn't seem like a trivial problem, but it's so popular in Japan that there must be something already available...
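
ZXing ("Zebra Crossing") is one widely used open-source library for exactly this. Below is a minimal sketch of an encode/decode round trip using its core and javase modules; treat the exact setup as an assumption to verify against the library's documentation.

import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

import com.google.zxing.BarcodeFormat;
import com.google.zxing.BinaryBitmap;
import com.google.zxing.MultiFormatReader;
import com.google.zxing.Result;
import com.google.zxing.client.j2se.BufferedImageLuminanceSource;
import com.google.zxing.client.j2se.MatrixToImageWriter;
import com.google.zxing.common.BitMatrix;
import com.google.zxing.common.HybridBinarizer;
import com.google.zxing.qrcode.QRCodeWriter;

public class QrRoundTrip {
    public static void main(String[] args) throws Exception {
        // Encode: text -> 300x300 QR code image
        BitMatrix matrix = new QRCodeWriter().encode(
                "http://example.com", BarcodeFormat.QR_CODE, 300, 300);
        BufferedImage qr = MatrixToImageWriter.toBufferedImage(matrix);
        ImageIO.write(qr, "png", new File("qr.png"));

        // Decode: image -> text
        BufferedImage loaded = ImageIO.read(new File("qr.png"));
        BinaryBitmap bitmap = new BinaryBitmap(
                new HybridBinarizer(new BufferedImageLuminanceSource(loaded)));
        Result result = new MultiFormatReader().decode(bitmap);
        System.out.println(result.getText()); // "http://example.com"
    }
}

Decoding real camera frames (skew, blur, uneven lighting) is where the hard part lives; ZXing's detector handles the finder patterns and perspective correction for you.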


Source: (StackOverflow)

iOS Heart rate detection Algorithm

I'm trying to implement heart beat recording functionality in an app I'm developing.

The preferred method of doing this is to use the iPhone's camera with the light on, have the user place their finger on the lens, and detect fluctuations in the video feed, which correspond to the user's heartbeat.

I found a very good starting point in an existing Stack Overflow question.

The question provides useful code to plot a heart beat time graph.

It shows how to start an AVCaptureSession and turn the camera's light on like so:

session = [[AVCaptureSession alloc] init];

AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if([camera isTorchModeSupported:AVCaptureTorchModeOn]) {
    [camera lockForConfiguration:nil];
    camera.torchMode=AVCaptureTorchModeOn;
    //  camera.exposureMode=AVCaptureExposureModeLocked;
    [camera unlockForConfiguration];
}
// Create a AVCaptureInput with the camera device
NSError *error=nil;
AVCaptureInput* cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
if (cameraInput == nil) {
    NSLog(@"Error to create camera capture:%@",error);
}

// Set the output
AVCaptureVideoDataOutput* videoOutput = [[AVCaptureVideoDataOutput alloc] init];

// create a queue to run the capture on
dispatch_queue_t captureQueue=dispatch_queue_create("captureQueue", NULL);

// setup our delegate
[videoOutput setSampleBufferDelegate:self queue:captureQueue];

// configure the pixel format
videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey,
                             nil];
videoOutput.minFrameDuration=CMTimeMake(1, 10);

// and the size of the frames we want
[session setSessionPreset:AVCaptureSessionPresetLow];

// Add the input and output
[session addInput:cameraInput];
[session addOutput:videoOutput];

// Start the session
[session startRunning];

self in this example must be an <AVCaptureVideoDataOutputSampleBufferDelegate>, and will therefore have to implement the following method to obtain the raw camera data:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    static int count=0;
    count++;
    // only run if we're not already processing an image
    // this is the image buffer
    CVImageBufferRef cvimgRef = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the image buffer
    CVPixelBufferLockBaseAddress(cvimgRef,0);
    // access the data
    int width = CVPixelBufferGetWidth(cvimgRef);
    int height = CVPixelBufferGetHeight(cvimgRef);
    // get the raw image bytes
    uint8_t *buf = (uint8_t *) CVPixelBufferGetBaseAddress(cvimgRef);
    size_t bprow = CVPixelBufferGetBytesPerRow(cvimgRef);
    // average the R, G and B channels over the whole frame
    float r = 0, g = 0, b = 0;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width * 4; x += 4) {
            b += buf[x];
            g += buf[x + 1];
            r += buf[x + 2];
            //          a += buf[x + 3];
        }
        buf += bprow;
    }
    // unlock the buffer now that we're done reading it
    CVPixelBufferUnlockBaseAddress(cvimgRef, 0);
    r /= 255 * (float)(width * height);
    g /= 255 * (float)(width * height);
    b /= 255 * (float)(width * height);

    float h, s, v;

    RGBtoHSV(r, g, b, &h, &s, &v);

    // simple highpass and lowpass filter
    static float lastH = 0;
    float highPassValue = h - lastH;
    lastH = h;
    // note: this must be static so the smoothing state survives between frames
    static float lastHighPassValue = 0;
    float lowPassValue = (lastHighPassValue + highPassValue) / 2;

    lastHighPassValue = highPassValue;

    // low pass value can now be used for basic heart beat detection
}

RGB is converted to HSV, and it is the hue channel that is monitored for fluctuations.

RGB to HSV is implemented as follows:

void RGBtoHSV( float r, float g, float b, float *h, float *s, float *v ) {
    float min, max, delta;
    min = MIN( r, MIN(g, b ));
    max = MAX( r, MAX(g, b ));
    *v = max;
    delta = max - min;
    if( max != 0 )
        *s = delta / max;
    else {
        // r = g = b = 0
        *s = 0;
        *h = -1;
        return;
    }
    if( r == max )
        *h = ( g - b ) / delta;
    else if( g == max )
        *h = 2 + ( b - r ) / delta;
    else
        *h = 4 + ( r - g ) / delta;
    *h *= 60;
    if( *h < 0 )
        *h += 360;
}

The low pass value calculated in captureOutput: initially provides erratic data, but then stabilises to the following:

2013-11-04 16:18:13.619 SampleHeartRateApp[1743:1803] -0.071218
2013-11-04 16:18:13.719 SampleHeartRateApp[1743:1803] -0.050072
2013-11-04 16:18:13.819 SampleHeartRateApp[1743:1803] -0.011375
2013-11-04 16:18:13.918 SampleHeartRateApp[1743:1803] 0.018456
2013-11-04 16:18:14.019 SampleHeartRateApp[1743:1803] 0.059024
2013-11-04 16:18:14.118 SampleHeartRateApp[1743:1803] 0.052198
2013-11-04 16:18:14.219 SampleHeartRateApp[1743:1803] 0.078189
2013-11-04 16:18:14.318 SampleHeartRateApp[1743:1803] 0.046035
2013-11-04 16:18:14.419 SampleHeartRateApp[1743:1803] -0.113153
2013-11-04 16:18:14.519 SampleHeartRateApp[1743:1803] -0.079792
2013-11-04 16:18:14.618 SampleHeartRateApp[1743:1803] -0.027654
2013-11-04 16:18:14.719 SampleHeartRateApp[1743:1803] -0.017288

An example of the erratic data provided initially is here:

2013-11-04 16:17:28.747 SampleHeartRateApp[1743:3707] 17.271435
2013-11-04 16:17:28.822 SampleHeartRateApp[1743:1803] -0.049067
2013-11-04 16:17:28.922 SampleHeartRateApp[1743:1803] -6.524201
2013-11-04 16:17:29.022 SampleHeartRateApp[1743:1803] -0.766260
2013-11-04 16:17:29.137 SampleHeartRateApp[1743:3707] 9.956407
2013-11-04 16:17:29.221 SampleHeartRateApp[1743:1803] 0.076244
2013-11-04 16:17:29.321 SampleHeartRateApp[1743:1803] -1.049292
2013-11-04 16:17:29.422 SampleHeartRateApp[1743:1803] 0.088634
2013-11-04 16:17:29.522 SampleHeartRateApp[1743:1803] -1.035559
2013-11-04 16:17:29.621 SampleHeartRateApp[1743:1803] 0.019196
2013-11-04 16:17:29.719 SampleHeartRateApp[1743:1803] -1.027754
2013-11-04 16:17:29.821 SampleHeartRateApp[1743:1803] 0.045803
2013-11-04 16:17:29.922 SampleHeartRateApp[1743:1803] -0.857693
2013-11-04 16:17:30.021 SampleHeartRateApp[1743:1803] 0.061945
2013-11-04 16:17:30.143 SampleHeartRateApp[1743:1803] -0.701269

The low pass value goes positive whenever there is a heart beat. So I tried a very simple live detection algorithm: look at the current value and check whether it is positive, then look at the previous value; if that was negative, the signal has gone from negative to positive, and a beep sound is played.
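
In outline, that detector looks something like this (Java used purely for illustration; the names and hooks are assumed):

// Illustrative sketch of the sign-change detector described above.
private float previousValue = 0f;
private int beatCount = 0; // beats counted in the current 10-second window

void onNewLowPassValue(float value) {
    // Count a beat on a negative-to-positive transition of the filtered signal.
    if (previousValue < 0 && value >= 0) {
        beatCount++;
        // playBeep(); // assumed feedback hook
    }
    previousValue = value;
}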

The problem with this is that the data isn't always as clean as the above; sometimes there are anomalous positive readings in amongst negative readings, and vice versa.

A graph of the low pass value over time looks like this: [graph image from the original post]

Interestingly, the above anomaly is quite common; if I record a graph for a while, I'll see a very similarly shaped anomaly multiple times.

In my very simple beat detection algorithm, if an anomaly as shown above occurs, the counted number of beats in the detection period (10 seconds) can shoot up by 4 or 5 beats. This makes the calculated BPM very inaccurate. But, simple as it is, it does work around 70% of the time.

To combat this problem I tried the following.

1. Started recording the last 3 low pass values in an array.

2. Then looked to see whether or not the middle value had two smaller values surrounding it, before and after (basic peak detection).

3. Counted this scenario as a beat and added it to the running total of beats in a given time.

This method is, however, just as vulnerable to the anomalies as any other, and actually seemed to be worse: when playing live beeps after detection, they seemed far more erratic than with the negative-to-positive algorithm.
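
For what it's worth, one common hardening step (a suggestion, not from the original post) is a refractory period: ignore any detection that follows the previous one faster than a physiologically plausible beat interval, so a single anomaly can add at most one spurious beat. A sketch, again in illustrative Java:

// Sign-change detection plus a refractory period.
// 250 ms corresponds to a 240 BPM upper bound; tune as needed.
private static final long MIN_BEAT_INTERVAL_MS = 250;
private long lastBeatTimeMs = 0;
private float previous = 0f;
private int beats = 0; // assumed running total

void onSample(float value, long nowMs) {
    boolean negativeToPositive = previous < 0 && value >= 0;
    if (negativeToPositive && nowMs - lastBeatTimeMs > MIN_BEAT_INTERVAL_MS) {
        lastBeatTimeMs = nowMs;
        beats++;
    }
    previous = value;
}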

My question is: can you help me come up with an algorithm that can reliably detect, with reasonable accuracy, when a heart beat occurs?

Another problem I realise I'm going to have to address is detecting whether or not a user's finger is on the lens.

I thought about detecting erratic low pass values, but the problem there is that the low pass filter is designed to smooth out erratic values over time. So help there would be appreciated too.

Thanks for your time.


Source: (StackOverflow)

getUserMedia() shim for PhoneGap/Cordova?

I have created a web app with Cordova and I need to show a live camera stream as my background. It seems that the Camera/Video APIs from Cordova just open the native Camera/Video apps instead of returning live camera data. What I really need is something like getUserMedia(), which is only available in Opera and Chrome (as of June 2012).

Is there a shim to use getUserMedia() within Cordova or any Plugins which behave similarly?


Source: (StackOverflow)

How can I set camera preview size to squared aspect ratio in a squared SurfaceView (like Instagram)

I'm trying to develop my own camera activity, but I have a problem that I'm unable to solve...

What I want is something very similar to Instagram's photo frame, and this is what I get:

[my image]

when I should get something like this:

[Instagram image]

and...

[my second image]

when I should get something like:

[second Instagram image]

I think I'm managing the SurfaceView and Camera preview well, only using

Camera.Parameters parameters = camera.getParameters();
camera.setDisplayOrientation(90);

and this custom SurfaceView:

public class SquaredSurfaceView extends SurfaceView {

    private int width;
    private int height;

    public SquaredSurfaceView(Context context) {
        super(context);
    }

    public SquaredSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    public SquaredSurfaceView(Context context, AttributeSet attrs, int defStyle) {
        super(context, attrs, defStyle);
    }

    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
        super.onMeasure(widthMeasureSpec, heightMeasureSpec);
        width = MeasureSpec.getSize(widthMeasureSpec);
        height = width;
        setMeasuredDimension(width, width);
    }

    public int getViewWidth() {
        return width;
    }

    public int getViewHeight() {
        return height;
    }
}

What am I doing wrong? :-(
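
For reference, one common approach (an assumption, not a confirmed fix): the camera only delivers preview frames in its supported aspect ratios (typically 4:3 or 16:9), so squeezing the preview into a square surface distorts it, as in the images above. Instead, measure the SurfaceView at the preview's aspect ratio and let a fixed-size square parent clip the overflow:

// Sketch: measure the surface at the preview aspect ratio (4:3 assumed here;
// with the 90-degree display rotation the on-screen height is width * 4 / 3)
// and place it inside a square parent that clips the extra height.
@Override
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
    int width = MeasureSpec.getSize(widthMeasureSpec);
    setMeasuredDimension(width, width * 4 / 3);
}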


Source: (StackOverflow)

How are SurfaceHolder callbacks related to the Activity lifecycle?

I've been trying to implement an application that requires a camera preview on a surface. As I see it, the activity and surface lifecycles consist of the following states:

  1. When I first launch my Activity: onResume() -> surfaceCreated() -> surfaceChanged()
  2. When I leave my Activity: onPause() -> surfaceDestroyed()

In this scheme, I can make the corresponding calls, like open/release camera and start/stop preview, in onResume()/onPause() and surfaceCreated()/surfaceDestroyed().
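
A minimal sketch of that pairing (mCamera is an assumed android.hardware.Camera field shared between the Activity and the SurfaceHolder.Callback):

@Override
protected void onResume() {
    super.onResume();
    mCamera = Camera.open(); // acquire the camera when coming to the foreground
}

@Override
protected void onPause() {
    super.onPause();
    if (mCamera != null) {
        mCamera.release(); // never hold the camera while paused
        mCamera = null;
    }
}

// ...and in the SurfaceHolder.Callback:
public void surfaceCreated(SurfaceHolder holder) {
    try {
        mCamera.setPreviewDisplay(holder);
        mCamera.startPreview();
    } catch (IOException e) {
        Log.e(tag, "setPreviewDisplay failed", e);
    }
}

public void surfaceDestroyed(SurfaceHolder holder) {
    if (mCamera != null) {
        mCamera.stopPreview();
    }
}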

It works fine, unless I lock the screen. When I launch the app, then lock the screen and unlock it later, I see:

onPause() and nothing else after the screen is locked; then onResume() after unlock, and no surface callbacks after that. In fact, onResume() is called as soon as the power button is pressed and the screen is on, while the lock screen is still active, i.e. before the activity even becomes visible.

With this scheme, I get a black screen after unlock, and no surface callbacks are called.

Here's a code fragment that doesn't do any actual camera work, only the SurfaceHolder callbacks. The issue above is reproduced even with this code on my phone (the callbacks are called in the normal sequence when you press the "Back" button, but are missing when you lock the screen):

class Preview extends SurfaceView implements SurfaceHolder.Callback {

    private static final String tag= "Preview";

    public Preview(Context context) {
        super(context);
        Log.d(tag, "Preview()");
        SurfaceHolder holder = getHolder();
        holder.addCallback(this);
        holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(tag, "surfaceCreated");
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(tag, "surfaceDestroyed");
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        Log.d(tag, "surfaceChanged");
    }
}

Any ideas on why the surface remains undestroyed after the Activity is paused? Also, how do you handle the camera lifecycle in such cases?


Source: (StackOverflow)