EzDevInfo.com

mp4parser

A Java API to read, write and create MP4 files

mp4parser on Android error - MovieCreator.build(): "java.lang.RuntimeException: No box object found for ftyp"

I'm using the mp4parser library. The call MovieCreator.build(new FileInputStream(f1).getChannel()) fails with: RuntimeException: No box object found for ftyp. Please help!
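"No box object found for ftyp" usually means the parser is not looking at the start of a valid MP4: an empty or truncated file, a wrong path, or a channel whose position is not 0. A quick stdlib-only sanity check of the first box header, independent of mp4parser (class and method names are mine):

```java
import java.nio.charset.StandardCharsets;

public class FtypCheck {
    /** Four-character type of the first ISO BMFF box (bytes 4-7 of the file). */
    public static String firstBoxType(byte[] fileStart) {
        if (fileStart.length < 8) {
            throw new IllegalArgumentException("not enough data for a box header");
        }
        // A box header is a 4-byte big-endian size followed by a 4-character type.
        return new String(fileStart, 4, 4, StandardCharsets.US_ASCII);
    }

    /** True when the data starts with an 'ftyp' box, as a well-formed MP4 should. */
    public static boolean looksLikeMp4(byte[] fileStart) {
        return fileStart.length >= 8 && "ftyp".equals(firstBoxType(fileStart));
    }
}
```

In practice this means verifying that the file exists, is non-empty, and that the recording actually finished before the FileChannel is handed to MovieCreator.build().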


Source: (StackOverflow)

app crash when try to combine mp4 videos

I downloaded the latest source code for mp4parser and used it to combine two mp4 files. When I try to combine the two videos, the app crashes.

When I run this line of code

  MovieCreator.build("mnt/sdcard/CamVideo/0.mp4")

the app crashes with the following log:

  08-11 17:27:49.023: E/AndroidRuntime(24864): FATAL EXCEPTION: main
  08-11 17:27:49.023: E/AndroidRuntime(24864): java.lang.ArrayIndexOutOfBoundsException:             length=0; index=0
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at com.googlecode.mp4parser.authoring.samples.DefaultMp4SampleList.<init>(DefaultMp4SampleList.java:52)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at com.coremedia.iso.boxes.mdat.SampleList.<init>(SampleList.java:33)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at com.googlecode.mp4parser.authoring.Mp4TrackImpl.<init>(Mp4TrackImpl.java:59)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at com.googlecode.mp4parser.authoring.container.mp4.MovieCreator.build(MovieCreator.java:58)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at com.googlecode.mp4parser.authoring.container.mp4.MovieCreator.build(MovieCreator.java:39)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at com.coderzheaven.pack.CustomCameraActivity.MergeVideos(CustomCameraActivity.java:200)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at com.coderzheaven.pack.CustomCameraActivity$1.onClick(CustomCameraActivity.java:108)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at android.view.View.performClick(View.java:4274)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at android.view.View$PerformClick.run(View.java:17357)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at android.os.Handler.handleCallback(Handler.java:615)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at android.os.Handler.dispatchMessage(Handler.java:92)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at android.os.Looper.loop(Looper.java:137)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at android.app.ActivityThread.main(ActivityThread.java:4949)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at java.lang.reflect.Method.invokeNative(Native Method)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at java.lang.reflect.Method.invoke(Method.java:511)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1043)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:810)
  08-11 17:27:49.023: E/AndroidRuntime(24864):  at dalvik.system.NativeStart.main(Native Method)

I am using the same properties files as the sample code, "isoparser-custom.properties" and "isoparser-default.properties".

I expect the Movie object to build properly. Can anyone tell me what I am doing wrong, or share fully functional code that merges two mp4 files?

EDITED: I downloaded "isoparser-1.0.2.jar" from the Maven repository (http://repo1.maven.org/maven2/com/googlecode/mp4parser/isoparser/1.0.2/), but the app crashed with this report:

  08-18 17:32:20.571: E/AndroidRuntime(31446): FATAL EXCEPTION: main
  08-18 17:32:20.571: E/AndroidRuntime(31446): java.lang.ExceptionInInitializerError
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at java.lang.Class.classForName(Native Method)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at java.lang.Class.forName(Class.java:217)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at java.lang.Class.forName(Class.java:172)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.coremedia.iso.PropertyBoxParserImpl.createBox(PropertyBoxParserImpl.java:86)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.coremedia.iso.AbstractBoxParser.parseBox(AbstractBoxParser.java:102)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.googlecode.mp4parser.BasicContainer.next(BasicContainer.java:155)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.googlecode.mp4parser.BasicContainer.hasNext(BasicContainer.java:131)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.googlecode.mp4parser.util.LazyList$1.hasNext(LazyList.java:55)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.coremedia.iso.IsoFile.getMovieBox(IsoFile.java:109)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.googlecode.mp4parser.authoring.container.mp4.MovieCreator.build(MovieCreator.java:48)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.googlecode.mp4parser.authoring.container.mp4.MovieCreator.build(MovieCreator.java:35)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.coderzheaven.pack.CustomCameraActivity.MergeVideos(CustomCameraActivity.java:200)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.coderzheaven.pack.CustomCameraActivity$1.onClick(CustomCameraActivity.java:108)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at android.view.View.performClick(View.java:3517)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at android.view.View$PerformClick.run(View.java:14155)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at android.os.Handler.handleCallback(Handler.java:605)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at android.os.Handler.dispatchMessage(Handler.java:92)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at android.os.Looper.loop(Looper.java:137)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at android.app.ActivityThread.main(ActivityThread.java:4666)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at java.lang.reflect.Method.invokeNative(Native Method)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at java.lang.reflect.Method.invoke(Method.java:511)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:809)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:576)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at dalvik.system.NativeStart.main(Native Method)
  08-18 17:32:20.571: E/AndroidRuntime(31446): Caused by: java.lang.NoClassDefFoundError: org.aspectj.runtime.reflect.Factory
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.coremedia.iso.boxes.FileTypeBox.ajc$preClinit(FileTypeBox.java:1)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  at com.coremedia.iso.boxes.FileTypeBox.<clinit>(FileTypeBox.java:1)
  08-18 17:32:20.571: E/AndroidRuntime(31446):  ... 24 more
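The Caused by line points at the actual problem: the 1.0.x isoparser jar was woven with AspectJ, and its classes reference org.aspectj.runtime.reflect.Factory at class-initialization time, so the AspectJ runtime must also be on the app's classpath. A Maven fragment to pull it in (the version shown is illustrative):

```xml
<!-- isoparser 1.0.x references the AspectJ runtime when its classes
     initialize, so ship it with the app (version number illustrative) -->
<dependency>
    <groupId>org.aspectj</groupId>
    <artifactId>aspectjrt</artifactId>
    <version>1.8.5</version>
</dependency>
```

On a plain Android project without Maven, dropping the corresponding aspectjrt jar into libs/ serves the same purpose.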

Source: (StackOverflow)


Unable to transcode audio via Android MediaCodec API

I'm trying to write a basic raw AAC data to a file, in hopes that I can use mp4parser to encapsulate it with a video track. For that, I need to encode any given audio file to that format. MediaCodec API is readily available since API 16, so I've decided to use that for the codec operation.

I'm not sure why so few resources are available online about this, possibly due to the complexity involved. Still, I've managed to learn that the fundamental approach should be:

Get sample data via MediaExtractor -> Enqueue decoder input buffer -> Dequeue output buffer and get the decoded data -> Enqueue encoder input buffer -> Dequeue encoder output buffer -> Write the encoded data to file.
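One detail worth getting right in that chain is the encoder-side timestamps: an AAC frame always carries 1024 PCM samples per channel, so presentation times can be derived per encoded frame instead of reusing whatever the decoder delivered. A small helper (the class and method names are mine, not part of any API):

```java
public class AacTiming {
    // Fixed by the AAC specification: each frame encodes 1024 PCM samples per channel.
    static final long SAMPLES_PER_FRAME = 1024;

    /** Presentation timestamp (microseconds) of encoded frame n at the given sample rate. */
    public static long framePtsUs(long frameIndex, int sampleRateHz) {
        return frameIndex * 1_000_000L * SAMPLES_PER_FRAME / sampleRateHz;
    }
}
```

At 44.1 kHz each frame spans about 23.2 ms. Note also that a bare stream of AAC frames usually needs an ADTS header per frame before tools (including mp4parser's AAC track support) will accept it as input.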

private void transcodeFile(File source, File destination) throws IOException {
    FileInputStream inputStream = new FileInputStream(source);
    FileOutputStream outputStream = new FileOutputStream(destination);

    log("Transcoding file: " + source.getName());

    MediaExtractor extractor;
    MediaCodec encoder;
    MediaCodec decoder;

    ByteBuffer[] encoderInputBuffers;
    ByteBuffer[] encoderOutputBuffers;
    ByteBuffer[] decoderInputBuffers;
    ByteBuffer[] decoderOutputBuffers;

    int noOutputCounter = 0;
    int noOutputCounterLimit = 10;

    extractor = new MediaExtractor();
    extractor.setDataSource(inputStream.getFD());
    extractor.selectTrack(0);

    log(String.format("TRACKS #: %d", extractor.getTrackCount()));
    MediaFormat format = extractor.getTrackFormat(0);
    String mime = format.getString(MediaFormat.KEY_MIME);
    log(String.format("MIME TYPE: %s", mime));


    final String outputType = MediaFormat.MIMETYPE_AUDIO_AAC;
    encoder = MediaCodec.createEncoderByType(outputType);
    MediaFormat encFormat = MediaFormat.createAudioFormat(outputType, 44100, 2);
    encFormat.setInteger(MediaFormat.KEY_BIT_RATE, 64000);
    encoder.configure(encFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

    decoder = MediaCodec.createDecoderByType(mime);
    decoder.configure(format, null, null, 0);

    encoder.start();
    decoder.start();

    encoderInputBuffers = encoder.getInputBuffers();
    encoderOutputBuffers = encoder.getOutputBuffers();

    decoderInputBuffers = decoder.getInputBuffers();
    decoderOutputBuffers = decoder.getOutputBuffers();

    int timeOutUs = 1000;
    long presentationTimeUs = 0;

    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    boolean inputEOS = false;
    boolean outputEOS = false;

    while(!outputEOS && noOutputCounter < noOutputCounterLimit) {
        noOutputCounter++;

        if(!inputEOS) {
            int decInputBufferIndex = decoder.dequeueInputBuffer(timeOutUs);
            log("decInputBufferIndex: " + decInputBufferIndex);
            if (decInputBufferIndex >= 0) {
                ByteBuffer dstBuffer = decoderInputBuffers[decInputBufferIndex];

                //Getting sample with MediaExtractor
                int sampleSize = extractor.readSampleData(dstBuffer, 0);
                if (sampleSize < 0) {
                    inputEOS = true;
                    log("Input EOS");
                    sampleSize = 0;
                } else {
                    presentationTimeUs = extractor.getSampleTime();
                }

                log("Input sample size: " + sampleSize);

                //Enqueue decoder input buffer
                decoder.queueInputBuffer(decInputBufferIndex, 0, sampleSize, presentationTimeUs, inputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
                if (!inputEOS) extractor.advance();

            } else {
                log("decInputBufferIndex: " + decInputBufferIndex);
            }
        }

        //Dequeue decoder output buffer
        int res = decoder.dequeueOutputBuffer(info, timeOutUs);
        if(res >= 0) {
            if(info.size > 0) noOutputCounter = 0;

            int decOutputBufferIndex = res;
            log("decOutputBufferIndex: " + decOutputBufferIndex);

            ByteBuffer buffer = decoderOutputBuffers[decOutputBufferIndex];
            buffer.position(info.offset);
            buffer.limit(info.offset + info.size);

            final int size = buffer.limit();
            if(size > 0) {
                //audioTrack.write(buffer, buffer.limit(), AudioTrack.MODE_STATIC);

                int encInputBufferIndex = encoder.dequeueInputBuffer(-1);
                log("encInputBufferIndex: " + encInputBufferIndex);
                //fill the input buffer with the decoded data
                if(encInputBufferIndex >= 0) {
                    ByteBuffer dstBuffer = encoderInputBuffers[encInputBufferIndex];
                    dstBuffer.clear();
                    dstBuffer.put(buffer);

                    encoder.queueInputBuffer(encInputBufferIndex, 0, info.size, info.presentationTimeUs, 0);
                    int encOutputBufferIndex = encoder.dequeueOutputBuffer(info, timeOutUs);
                    if(encOutputBufferIndex >= 0) {
                        log("encOutputBufferIndex: " + encOutputBufferIndex);
                        ByteBuffer outBuffer = encoderOutputBuffers[encOutputBufferIndex];
                        byte[] out = new byte[outBuffer.remaining()];
                        outBuffer.get(out);
                        //write data to file
                        outputStream.write(out);
                    }
                }
            }
            decoder.releaseOutputBuffer(decOutputBufferIndex, false);
            if((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                outputEOS = true;
                log("Output EOS");
            }
        } else if (res == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            decoderOutputBuffers = decoder.getOutputBuffers();
            log("Output buffers changed.");
        } else if (res == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            log("Output format changed.");
        } else {
            log("Dequeued output buffer returned: " + res);
        }
    }

    log("Stopping..");
    releaseCodec(decoder);
    releaseCodec(encoder);
    inputStream.close();
    outputStream.close();

}

The output file is not valid for some reason. Why?

EDIT: Managed to fix an Exception, issue persists.

EDIT 2: I've prevented the buffer overflow by setting the buffer size to the bitrate in the encoder format settings. There are currently two issues: 1. After a very short interval it gets stuck, possibly waiting indefinitely, at int encInputBufferIndex = encoder.dequeueInputBuffer(-1); 2. Decoding takes as long as the track's duration; why does it keep to the real-time pacing of the samples?

EDIT 3: Testing with AudioTrack.write(), the audio plays fine, but that isn't the goal; it suggests the decoding runs in sync with the media file being fed, whereas it should run as fast as possible so the encoder can do its job quickly. Changing the presentationTimeUs in decoder.queueInputBuffer() did nothing.


Source: (StackOverflow)

Is there any way to compress a video in Android native app?

I am working on an Android native application with video recording. Everything works like a charm but video uploading is very slow.

I'm using mp4parser for pause/record. I looked for a video compression API reference but couldn't find anything.

Is there any way I can implement video compression? Your earliest response would be appreciated.

Thank you all!!!!
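There is no one-call compression API in the framework at this level; the usual route is re-encoding at a lower bitrate (with MediaCodec or an ffmpeg build). As a rough sizing aid, a target upload size fixes the total bitrate; a minimal stdlib-only sketch (names mine):

```java
public class BitrateMath {
    /** Total bitrate (bits/s, video + audio) that yields roughly the target file size. */
    public static long targetBitrate(long targetBytes, double durationSeconds) {
        return Math.round(targetBytes * 8 / durationSeconds);
    }
}
```

So a 60-second clip squeezed into about 10 MB needs roughly 1.4 Mbit/s in total, which you would then split between the video and audio MediaFormat.KEY_BIT_RATE values when re-encoding.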


Source: (StackOverflow)

Reverse video in android

I have recorded a video from the camera in my app and saved it to device storage. Now I want to reverse the video so that it plays backwards, i.e. if the video is 10 seconds long, the last frame at the 10th second becomes the first frame and playback runs from there back to the first frame at the 1st second. I want to save the reversed video to a file. How should I proceed?
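A container-level tool like mp4parser cannot do this, because MP4 video is inter-frame compressed: every frame has to be decoded, re-encoded in reverse order, and given new timestamps. The timestamp remapping itself is simple; a sketch assuming a constant frame duration (names mine):

```java
public class ReverseTimestamps {
    /**
     * Presentation timestamps (microseconds) for the reversed clip, assuming a
     * constant frame duration: the frame shown last is shown first, and so on.
     */
    public static long[] reversedPtsUs(long[] ptsUs, long clipDurationUs, long frameDurationUs) {
        long[] out = new long[ptsUs.length];
        for (int i = 0; i < ptsUs.length; i++) {
            out[i] = clipDurationUs - ptsUs[ptsUs.length - 1 - i] - frameDurationUs;
        }
        return out;
    }
}
```

The hard part is the decode/re-encode pass (MediaCodec or ffmpeg): keyframes only exist in the forward direction, so every frame between two keyframes must be buffered before it can be emitted in reverse.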


Source: (StackOverflow)

Rotate video with Mp4parser

I need to rotate a video to fit some of my needs. I'll explain the details below.

I'm creating a Vine-like app. I have to record video segments and then merge all the parts into a single file. I'm doing this without issue in an Android app using the mp4parser library, latest version 1.0-RC-26, following the example provided on their site: https://mp4parser.googlecode.com/svn/trunk/examples/src/main/java/com/googlecode/mp4parser/AppendExample.java.

The append-video example works fine if all the videos have the same orientation, but I discovered some issues recording video from the front camera, so the quick fix was to record with the video orientation set to 270. The downside is that segments recorded with this orientation appear with the wrong orientation in the merged video.

My possible solution is to rotate the video as needed in the different situations, but I don't have a working example for my code. Searching the internet I found solutions like this one: MP4Parser change video orientation. The problem with that code is that it is not compatible with the latest version (it gives a compilation error). I also tried to understand the logic of the library, but without results. For example, I experimented with the setMatrix instruction on the Movie object, but it simply doesn't work.
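For background on what setMatrix ultimately writes: orientation in an MP4 lives in the 3x3 transformation matrix of the tkhd/mvhd boxes, with entries stored in 16.16 fixed point (the third column is 2.30). A sketch of the first four entries for the right-angle rotations, as a way to check what a given Matrix constant should contain (names mine; rotation-direction conventions vary between tools):

```java
public class Mp4RotationMatrix {
    /**
     * First four entries (a, b, c, d) of the tkhd transformation matrix for a
     * right-angle rotation, in 16.16 fixed point. The remaining entries
     * (translation and the 2.30 column) keep their identity values.
     */
    public static int[] abcdFor(int degrees) {
        final int ONE = 0x0001_0000; // 1.0 in 16.16 fixed point
        switch (((degrees % 360) + 360) % 360) {
            case 0:   return new int[]{ ONE,  0,    0,    ONE  };
            case 90:  return new int[]{ 0,    ONE,  -ONE, 0    };
            case 180: return new int[]{ -ONE, 0,    0,    -ONE };
            case 270: return new int[]{ 0,    -ONE, ONE,  0    };
            default:  throw new IllegalArgumentException("only multiples of 90 supported");
        }
    }
}
```

If a given mp4parser version lacks a convenient setter, writing these values into the track's (or movie header's) matrix is what the various setMatrix variants boil down to.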

public static void mergeVideo(int SegmentNumber) throws Exception {

    Log.d("PM", "Merge process started");
     Movie[] inMovies = new Movie[SegmentNumber]   ;
     //long[] Matrix = new long[SegmentNumber];

     for (int i = 1 ; i <= SegmentNumber; i++){
         File file =  new File(getCompleteFilePath(i));
         if (file.exists()){
             FileInputStream fis = new FileInputStream(getCompleteFilePath(i));
             inMovies [i-1] = MovieCreator.build(fis.getChannel());
             //Set rotation: I experimented with this instruction but it is not working
             //(note: it must run after build(), otherwise inMovies[i-1] is still null)
             inMovies [i-1].setMatrix(Matrix.ROTATE_90);

             Log.d("PM", "Video " + i  + " merged" );
         }

         //fis.close();
     }


        List<Track> videoTracks = new LinkedList<Track>();
        List<Track> audioTracks = new LinkedList<Track>();

        for (Movie m : inMovies) {
            for (Track t : m.getTracks()) {
                if (t.getHandler().equals("soun")) {
                    audioTracks.add(t);
                }
                if (t.getHandler().equals("vide")) {
                    videoTracks.add(t);
                }
            }
        }

        Movie result = new Movie();

        if (audioTracks.size() > 0) {
            result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));
        }
        if (videoTracks.size() > 0) {
            result.addTrack(new AppendTrack(videoTracks.toArray(new Track[videoTracks.size()])));
        }

        Container out = new DefaultMp4Builder().build(result);

        //out.getMovieBox().getMovieHeaderBox().setMatrix(Matrix.ROTATE_180); //set orientation, default merged video have wrong orientation
        // Create a media file name
        //
        String filename =  getCompleteMergedVideoFilePath()  ;

        FileChannel fc = new RandomAccessFile(String.format(filename), "rw").getChannel();
        out.writeContainer(fc);
        fc.close();


        //don't leave until the file is on his place
        File file = new File (filename);
        do {
            if (! file.exists()){
                Log.d("PM", "Result file not ready");
            }
       } while (! file.exists() );
       //
        Log.d("PM", "Merge process finished");
}

Has anyone rotated video with the very latest version of mp4parser? English is not my native language, so I apologize for any grammar errors.


Source: (StackOverflow)

Combine Mp4s using mp4parser on Android

I was just wondering if anybody knows how to take an mp4 audio file and overlay it onto an mp4 video file using mp4parser on Android. I have been able to append one video to another; now I just need to overlay a raw mp4 that I have onto the combined file.

Any help would be appreciated!


Source: (StackOverflow)

Issue in Cutting Multiple clips from a Movie with mp4parser Library

I am using the mp4parser library to cut multiple clips from a recorded video. It works fine if I cut one part from the video, but when I try to cut multiple clips, only the first clip is cut properly; the others are just 0 or 1 second long. Here is my code:
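For reference, the heart of the crop below is mapping a cut time in seconds to a sample index by walking the stts (time-to-sample) entries. That walk can be exercised in isolation (stdlib only, names mine):

```java
public class SampleIndex {
    /**
     * Index of the last sample whose start time is <= t seconds, walking
     * stts-style (count, delta) entries at the given timescale.
     */
    public static long sampleAt(double tSeconds, long[][] countDeltaEntries, long timescale) {
        long currentSample = 0;
        double currentTime = 0;
        long result = 0;
        for (long[] entry : countDeltaEntries) {
            long count = entry[0], delta = entry[1];
            for (long j = 0; j < count; j++) {
                if (currentTime <= tSeconds) {
                    result = currentSample; // still at or before the cut point
                }
                currentTime += (double) delta / timescale; // delta = duration of this sample
                currentSample++;
            }
        }
        return result;
    }
}
```

With a 30000 timescale and a delta of 3000 per sample, each sample covers 0.1 s, so a cut at 0.25 s lands on sample index 2.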

import android.app.ProgressDialog;
import android.content.Context;
import android.os.Handler;
import android.os.Message;
import android.util.Log;
import android.widget.Toast;

import com.coremedia.iso.IsoFile;
import com.coremedia.iso.boxes.TimeToSampleBox;
import com.googlecode.mp4parser.authoring.Movie;
import com.googlecode.mp4parser.authoring.Track;
import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;
import com.googlecode.mp4parser.authoring.container.mp4.MovieCreator;
import com.googlecode.mp4parser.authoring.tracks.CroppedTrack;

import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.channels.FileChannel;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Date;
import java.util.LinkedList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import uk.org.humanfocus.hfi.Beans.TrimPoint;
import uk.org.humanfocus.hfi.Utils.Constants;
import uk.org.humanfocus.hfi.Utils.SimpleThreadFactory;
import uk.org.humanfocus.hfi.Utils.Ut;

/**
 * Shortens/Crops a track
 */
public class ShortenExample {

    private static final String TAG = "ShortenExample";
    private final Context mCxt;
    private ExecutorService mThreadExecutor = null;
    private SimpleInvalidationHandler mHandler;
    private ProgressDialog mProgressDialog;
    String filePath;
    ArrayList<TrimPoint> mTrimPoints;
    int videoLength;
    ArrayList<String> trimVideos;
    private class SimpleInvalidationHandler extends Handler {

        @Override
        public void handleMessage(final Message msg) {
            switch (msg.what) {
            case R.id.shorten:
                mProgressDialog.dismiss();

                if (msg.arg1 == 0)
                    Toast.makeText(mCxt,
                            mCxt.getString(R.string.message_error) + " " + (String) msg.obj,
                            Toast.LENGTH_LONG).show();
                else
                    Toast.makeText(mCxt,
                            mCxt.getString(R.string.message_shortened) + " " + (String) msg.obj,
                            Toast.LENGTH_LONG).show();
                break;
            }
        }
    }

    public ShortenExample(Context context) {
        mCxt = context;
        mHandler = new SimpleInvalidationHandler();
        //mProgressDialog = new ProgressDialog(mCxt);
        //mProgressDialog.setMessage("Wait Saving..");
        //mProgressDialog.setCancelable(false);
    }

    public void shorten(String filePath,ArrayList<TrimPoint> trimPoints, int endTime) {
        trimVideos = new ArrayList<String>();
        this.filePath = filePath;
        this.videoLength = endTime;
        this.mTrimPoints = trimPoints;
        Log.d(Constants.TAG,"End Time: "+endTime+" Trim Points: "+mTrimPoints.size());
        for (int i=0;i<trimPoints.size();i++){
            TrimPoint point = trimPoints.get(i);
            int start=0;
            int end = 0;
            if(point.getTime()-5<0){
                start = 0;
            }else{
                start = point.getTime()-5;
            }

            if(point.getTime()+5>videoLength){
                end = videoLength-1;
            }else {
                end = point.getTime() + 5;
            }
            Log.d(Constants.TAG,"Clip: "+start+" : "+end);
            doShorten(start,end);   
        }
        Log.d(Constants.TAG,"Done: "+trimVideos.size());
    }

    private void doShorten(final int _startTime, final int _endTime) {
        //mProgressDialog = Ut.ShowWaitDialog(mCxt, 0);

        //mProgressDialog.show();


        if(mThreadExecutor == null)
            mThreadExecutor = Executors.newSingleThreadExecutor(new SimpleThreadFactory("doShorten"));

        //this.mThreadExecutor.execute(new Runnable() {
        //  public void run() {
                try {
                    File folder = Ut.getTestMp4ParserVideosDir(mCxt);
                    //File folder = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM),"HFVideos"+File.separator+"TEMP");
                    //Log.d(Constants.TAG, folder.toString());
                    if (!folder.exists()) {
                        Log.d(TAG, "failed to create directory");
                    }

                    //Movie movie = new MovieCreator().build(new RandomAccessFile("/home/sannies/suckerpunch-distantplanet_h1080p/suckerpunch-distantplanet_h1080p.mov", "r").getChannel());
//                  Movie movie = MovieCreator.build(new FileInputStream("/home/sannies/CSI.S13E02.HDTV.x264-LOL.mp4").getChannel());
                    Movie movie = MovieCreator.build(new FileInputStream(new File(filePath)).getChannel());
                    //Log.d(Constants.TAG,"Movie: "+movie.toString());
                    List<Track> tracks = movie.getTracks();
                    movie.setTracks(new LinkedList<Track>());
                    // remove all tracks we will create new tracks from the old

                    double startTime = _startTime;
                    double endTime = _endTime;//(double) getDuration(tracks.get(0)) / tracks.get(0).getTrackMetaData().getTimescale();

                    boolean timeCorrected = false;

                    // Here we try to find a track that has sync samples. Since we can only start decoding
                    // at such a sample we SHOULD make sure that the start of the new fragment is exactly
                    // such a frame
                    for (Track track : tracks) {
                        if (track.getSyncSamples() != null && track.getSyncSamples().length > 0) {
                            if (timeCorrected) {
                                // This exception here could be a false positive in case we have multiple tracks
                                // with sync samples at exactly the same positions. E.g. a single movie containing
                                // multiple qualities of the same video (Microsoft Smooth Streaming file)

                                throw new RuntimeException("The startTime has already been corrected by another track with SyncSample. Not Supported.");
                            }
                            startTime = correctTimeToSyncSample(track, startTime, false);
                            endTime = correctTimeToSyncSample(track, endTime, true);
                            timeCorrected = true;
                        }
                    }

                    for (Track track : tracks) {
                        long currentSample = 0;
                        double currentTime = 0;
                        long startSample = -1;
                        long endSample = -1;

                        for (int i = 0; i < track.getDecodingTimeEntries().size(); i++) {
                            TimeToSampleBox.Entry entry = track.getDecodingTimeEntries().get(i);
                            for (int j = 0; j < entry.getCount(); j++) {
                                // entry.getDelta() is the amount of time the current sample covers.

                                if (currentTime <= startTime) {
                                    // current sample is still before the new starttime
                                    startSample = currentSample;
                                }
                                if (currentTime <= endTime) {
                                    // current sample is after the new start time and still before the new endtime
                                    endSample = currentSample;
                                } else {
                                    // current sample is after the end of the cropped video
                                    break;
                                }
                                currentTime += (double) entry.getDelta() / (double) track.getTrackMetaData().getTimescale();
                                currentSample++;
                            }
                        }
                        movie.addTrack(new CroppedTrack(track, startSample, endSample));
                    }
                    long start1 = System.currentTimeMillis();
                    IsoFile out = new DefaultMp4Builder().build(movie);
                    long start2 = System.currentTimeMillis();

//                  FileOutputStream fos = new FileOutputStream(String.format("output-%f-%f.mp4", startTime, endTime));
                    String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
                    String filename = folder.getPath() + File.separator + String.format("TMP4_APP_OUT-%f-%f", startTime, endTime) + "_" + timeStamp + ".mp4";
                    trimVideos.add(filename);
                    FileOutputStream fos = new FileOutputStream(filename);

                    FileChannel fc = fos.getChannel();
                    out.getBox(fc);
                    fc.close();
                    fos.close();
                    long start3 = System.currentTimeMillis();
                    System.err.println("Building IsoFile took : " + (start2 - start1) + "ms");
                    System.err.println("Writing IsoFile took  : " + (start3 - start2) + "ms");
                    System.err.println("Writing IsoFile speed : " + (new File(String.format("TMP4_APP_OUT-%f-%f", startTime, endTime)).length() / (start3 - start2) / 1000) + "MB/s");

                    Message.obtain(mHandler, R.id.shorten, 1, 0, filename).sendToTarget();
                } catch (FileNotFoundException e) {
                    Message.obtain(mHandler, R.id.shorten, 0, 0, e.getMessage()).sendToTarget();
                    e.printStackTrace();
                } catch (IOException e) {
                    Message.obtain(mHandler, R.id.shorten, 0, 0, e.getMessage()).sendToTarget();
                    e.printStackTrace();
                }
                //mProgressDialog.dismiss();
        //  }
        //});

    }

    protected static long getDuration(Track track) {
        long duration = 0;
        for (TimeToSampleBox.Entry entry : track.getDecodingTimeEntries()) {
            duration += entry.getCount() * entry.getDelta();
        }
        return duration;
    }

    private static double correctTimeToSyncSample(Track track, double cutHere, boolean next) {
        double[] timeOfSyncSamples = new double[track.getSyncSamples().length];
        long currentSample = 0;
        double currentTime = 0;
        for (int i = 0; i < track.getDecodingTimeEntries().size(); i++) {
            TimeToSampleBox.Entry entry = track.getDecodingTimeEntries().get(i);
            for (int j = 0; j < entry.getCount(); j++) {
                if (Arrays.binarySearch(track.getSyncSamples(), currentSample + 1) >= 0) {
                    // samples always start with 1 but we start with zero therefore +1
                    timeOfSyncSamples[Arrays.binarySearch(track.getSyncSamples(), currentSample + 1)] = currentTime;
                }
                currentTime += (double) entry.getDelta() / (double) track.getTrackMetaData().getTimescale();
                currentSample++;
            }
        }
        double previous = 0;
        for (double timeOfSyncSample : timeOfSyncSamples) {
            if (timeOfSyncSample > cutHere) {
                if (next) {
                    return timeOfSyncSample;
                } else {
                    return previous;
                }
            }
            previous = timeOfSyncSample;
        }
        return timeOfSyncSamples[timeOfSyncSamples.length - 1];
    }


}
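The correctTimeToSyncSample step above can likewise be tested in isolation once the sync-sample times are known; a stdlib-only equivalent of its snapping rule (names mine):

```java
public class SyncSnap {
    /**
     * Snap a cut time to a sync-sample (keyframe) time: the next one when
     * next == true, otherwise the previous one, so decoding can start cleanly.
     */
    public static double snap(double cutHere, double[] syncSampleTimes, boolean next) {
        double previous = 0;
        for (double t : syncSampleTimes) {
            if (t > cutHere) {
                return next ? t : previous;
            }
            previous = t;
        }
        return syncSampleTimes[syncSampleTimes.length - 1];
    }
}
```

Start times snap down and end times snap up, which is why the actual clip boundaries can differ from the requested ones by up to a keyframe interval.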

Source: (StackOverflow)

Concatenate mp4 files in Android using halfninja ffmpeg

I've managed to compile halfninja's ffmpeg scripts for the Android NDK using NDK version r5c (unfortunately, every attempt to compile with an earlier NDK generated errors). I'm also not very knowledgeable about the whole NDK process, so it's a bit hit-and-miss for me.

His scripts compile ffmpeg version N-30996-gf925b24 (the specific commit he wrote the scripts for).

On to my actual app: I manage to trim videos without problems. Now I need to join/concatenate them, but every attempt at the various command combinations found in these three links (link1, link2, link3) generates errors such as cat is not valid, > is undefined, unknown option filter_complex, or attempts to overwrite some of the input files.

Does anyone know whether it's possible (and how) to join/concatenate mp4 videos (all the same codec, size, quality, etc.) using halfninja's build of ffmpeg on Android, or how to build/get an ffmpeg for Android from the latest sources?

I also gave mp4parser a quick try, without much success.

Ultimately I was trying to get this pseudo-method to work:

public static File concatenate(String[] inputPaths, String outputPath){

    // ... do stuff do generate ffmpeg commands....
    VideoKit v = new VideoKit();
    v.run(cmds);

    File f = new File(outputPath);
    return f;
}
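For reference, modern ffmpeg builds offer a concat demuxer (-f concat with a list file, then -c copy), which avoids shell constructs like cat and > that fail when ffmpeg is invoked directly from Java; note this demuxer postdates the N-30996 commit mentioned above, so a newer build would be required. A sketch of assembling that argument list (the helper name and paths are illustrative, not part of VideoKit):

```java
import java.util.Arrays;

public class ConcatCmd {
    // Sketch: build the argv for ffmpeg's concat demuxer. The list file is
    // expected to contain one "file '/path/clip.mp4'" line per input clip.
    // "-c copy" concatenates without re-encoding, which requires all clips
    // to share the same codec parameters.
    static String[] concatArgs(String listFile, String outputPath) {
        return new String[] {
            "ffmpeg", "-f", "concat", "-i", listFile,
            "-c", "copy", outputPath
        };
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(concatArgs("/sdcard/list.txt", "/sdcard/out.mp4")));
    }
}
```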

Source: (StackOverflow)

mp4 tag editing with java

I want to edit the tags of an mp4 video file in Java. I found mp4parser on Google Code, but there is not enough documentation for it. What would be the best library for editing mp4 video tags in Java? And is there any limitation on the comment tag in an mp4 video?


Source: (StackOverflow)

Appending videos doesn't work properly - Android

I am using Sebastian Annies' example for mp4parser, where I append 3 videos. The result should be one video that plays all three videos back to back. However, I get one video that plays the last video three times. Here is my code...

            //    int i = number of videos....
      try {         
        String[] f = new String[i];

        for (int count = 0; count < i; count++) {
            f[count] = "/sdcard/vid" + i + ".mp4";
        }

        Movie[] inMovies = new Movie[i];

        for (int count = 0; count < i; count++) {
            inMovies[count] = MovieCreator.build(f[count]);
        }

        List<Track> videoTracks = new LinkedList<Track>();
        List<Track> audioTracks = new LinkedList<Track>();

        for (Movie m : inMovies) {
            for (Track t : m.getTracks()) {
                if (t.getHandler().equals("soun")) {
                    audioTracks.add(t);
                }
                if (t.getHandler().equals("vide")) {
                    videoTracks.add(t);
                }
            }
        }

        Movie result = new Movie();

        if (audioTracks.size() > 0) {
            result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));
        }
        if (videoTracks.size() > 0) {
            result.addTrack(new AppendTrack(videoTracks.toArray(new Track[videoTracks.size()])));
        }

        Container out = new DefaultMp4Builder().build(result);

        FileChannel fc = new RandomAccessFile(String.format
                ("/sdcard/output.mp4"),
                "rw").getChannel();
        out.writeContainer(fc);
        fc.close();
    } catch (FileNotFoundException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }

    VideoView v = (VideoView) findViewById(R.id.videoView1);
    //      v.setVideoPath("/sdcard/aaa" + i + ".mp4");
    v.setVideoPath("/sdcard/output.mp4");
    v.setMediaController(new MediaController(this));
    v.start();

I don't know why it isn't doing what it's supposed to do. Please help me. Thanks
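One thing worth double-checking in the snippet above: the path-building loop appends i (the total number of videos) instead of the loop variable count, so every array slot ends up naming the same file, which would explain the same clip playing repeatedly. A minimal sketch of the corrected loop (the helper name and paths are illustrative):

```java
public class FileNameLoop {
    // Build one distinct path per clip by indexing with the loop variable
    // (count) rather than the clip total (i).
    static String[] buildPaths(int i) {
        String[] f = new String[i];
        for (int count = 0; count < i; count++) {
            f[count] = "/sdcard/vid" + count + ".mp4"; // was: + i +
        }
        return f;
    }

    public static void main(String[] args) {
        for (String p : buildPaths(3)) {
            System.out.println(p);
        }
    }
}
```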


Source: (StackOverflow)

Rotate video with mp4parser according to the camera used

Basically I would like to rotate each video depending on which camera was used to record it. The front camera is mirrored at 90 degrees, whereas the back camera is displayed properly. Currently the code sets the Matrix according to the first clip: if the first clip was made with the front camera, it rotates all clips to 270 degrees, and vice versa. rotations is an ArrayList which contains the rotation of each clip.

for (TrackBox trackBox : trackBoxes) {
       Log.d("TRACKBOX", String.valueOf(rotations.get(i)));
       //trackBox.getTrackHeaderBox().setMatrix(Matrix.ROTATE_90);
       if(rotations.get(i) == 90){ //if clip was made with back camera
              trackBox.getTrackHeaderBox().setMatrix(Matrix.ROTATE_90);
              Log.d("Rotating to:", "90 degrees");

       } else if (rotations.get(i) == 270){ //if clip was made with front camera
              trackBox.getTrackHeaderBox().setMatrix(Matrix.ROTATE_270);
              Log.d("Rotating to:", "270 degrees");
       }
       m.addTrack(new Mp4TrackImpl(trackBox));

}
inMovies[i] = m;

Source: (StackOverflow)

How to merge audio and video file in android

Thanks for the great mp4parser lib; I have a few queries related to audio/video muxing.

We used the code below in Android, but we are not getting the expected output. We have kept a working mp4 file in the specified directory, but no luck.

We do get a merged file, but the audio gets appended after the video instead of being muxed alongside it. The appended audio will not play; it simply increases the length of the video.

Any help would be appreciated.

Here is the code,

    File sdCard = Environment.getDataDirectory();

    String videofilepath = Environment.getExternalStorageDirectory().toString()+"/video.mp4";
    String audiofilepath = Environment.getExternalStorageDirectory().toString()+"/audio.aac";
    File file=new File(videofilepath);

    H264TrackImpl h264Track = new H264TrackImpl(new FileDataSourceImpl(videofilepath));
    AACTrackImpl aacTrack = new AACTrackImpl(new FileDataSourceImpl(audiofilepath));

    Movie movie = new Movie();
    movie.addTrack(h264Track);
    movie.addTrack(aacTrack);


    Container mp4file = new DefaultMp4Builder().build(movie);

    FileChannel fc = new FileOutputStream(new File(Environment.getExternalStorageDirectory().toString() + "/video.mp4")).getChannel();
    mp4file.writeContainer(fc);
    fc.close();
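One detail worth noting in the code above: the output FileChannel points at the same /video.mp4 that H264TrackImpl is still reading from, so the source gets overwritten while it is being muxed. A small sketch of deriving a distinct output path instead (the helper name is illustrative):

```java
public class OutPath {
    // Pick an output path distinct from the input so the source video is
    // not clobbered while the muxer is still reading its samples.
    static String muxedPath(String videoPath) {
        int dot = videoPath.lastIndexOf('.');
        return videoPath.substring(0, dot) + "_muxed" + videoPath.substring(dot);
    }

    public static void main(String[] args) {
        System.out.println(muxedPath("/sdcard/video.mp4")); // /sdcard/video_muxed.mp4
    }
}
```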

Source: (StackOverflow)

Does the mp4parser Java library have an Android version?

I'm trying to use the mp4parser library on Android and I have encountered some difficulties. I'm guessing this is because mp4parser is a Java-based project that hasn't yet been adapted to Android.

PS - mp4parser is widely used, even by Instagram.

I'm basing my conclusions on the following piece of code:

 public PropertyBoxParserImpl(String... customProperties) {
    InputStream is = getClass().getResourceAsStream("/assets/isoparser-default.properties");
    mapping = new Properties();
    try {
        mapping.load(is);
        Enumeration<URL> enumeration = Thread.currentThread().getContextClassLoader().getResources("isoparser-custom.properties");

        while (enumeration.hasMoreElements()) {
            URL url = enumeration.nextElement();
            mapping.load(url.openStream());
        }
        for (String customProperty : customProperties) {
            mapping.load(getClass().getResourceAsStream(customProperty));
        }
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}

isoparser-default.properties is a reflection mapping:

hint=com.coremedia.iso.boxes.TrackReferenceTypeBox(type)
cdsc=com.coremedia.iso.boxes.TrackReferenceTypeBox(type)
meta-ilst=com.coremedia.iso.boxes.apple.AppleItemListBox()
-----name=com.coremedia.iso.boxes.apple.AppleNameBox()
-----mean=com.coremedia.iso.boxes.apple.AppleMeanBox()
-----data=com.coremedia.iso.boxes.apple.AppleDataBox()
rmra=com.coremedia.iso.boxes.apple.AppleReferenceMovieBox()
rmda=com.coremedia.iso.boxes.apple.AppleReferenceMovieDescriptorBox()
rmdr=com.coremedia.iso.boxes.apple.AppleDataRateBox()
rdrf=com.coremedia.iso.boxes.apple.AppleDataReferenceBox()

On Android you can't construct a URL that links to a local class and method and load it the way the PropertyBoxParserImpl() above does.

So I'm guessing that I'll just need to change this function. Has anyone else encountered these issues while using mp4parser?
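For what it's worth, the usual failure mode on Android is that getResourceAsStream returns null for a file packaged under assets/ (APK assets are not on the classpath), and Properties.load(null) then throws a NullPointerException. A small sketch that makes that failure explicit, fed here from an in-memory copy of one mapping line rather than the real classpath resource:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class PropsLoad {
    // Guard the stream so a missing classpath resource produces a clear
    // error instead of an NPE deep inside Properties.load.
    static Properties loadMapping(InputStream is) {
        if (is == null) {
            throw new IllegalStateException("isoparser-default.properties not found on classpath");
        }
        Properties mapping = new Properties();
        try {
            mapping.load(is);
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
        return mapping;
    }

    public static void main(String[] args) {
        String line = "hint=com.coremedia.iso.boxes.TrackReferenceTypeBox(type)\n";
        InputStream is = new ByteArrayInputStream(line.getBytes());
        System.out.println(loadMapping(is).getProperty("hint"));
    }
}
```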


Source: (StackOverflow)

Android mp4parser video appending results in 00:00 duration

Hi everyone. I am trying to append two videos on Android using mp4parser, but the problem is that the resulting video has a duration of 00:00, although its size is the size of video1 + video2. Both videos were captured with the back camera of the same handset with exactly the same encoding, so encoding isn't the problem. I've already had a look at the samples referenced on the official mp4parser site, and my code is almost identical, but there seems to be something else going on. Thanks in advance; here is my code

private void appendVideos() throws IOException
    {
        String f1 = Environment.getExternalStorageDirectory() + "/video1.mp4";
        String f2 = Environment.getExternalStorageDirectory()+ "/video2.mp4";

        Movie[] inMovies;

        inMovies = new Movie[]
                {
                MovieCreator.build(f1),
                MovieCreator.build(f2),
                };


        List<Track> videoTracks = new LinkedList<Track>();
        List<Track> audioTracks = new LinkedList<Track>();

        for (Movie m : inMovies) 
        {
            for (Track t : m.getTracks()) 
            {
                if (t.getHandler().equals("soun")) 
                {
                    audioTracks.add(t);
                }
                if (t.getHandler().equals("vide")) 
                {
                    videoTracks.add(t);
                }
            }
        }

        Movie result = new Movie();

        if (audioTracks.size() > 0) 
        {
            result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));
        }
        if (videoTracks.size() > 0) 
        {
            result.addTrack(new AppendTrack(videoTracks.toArray(new Track[videoTracks.size()])));
        }

        Container out = new DefaultMp4Builder().build(result);

        RandomAccessFile ram = new RandomAccessFile(String.format(Environment.getExternalStorageDirectory() + "/output.mp4"), "rw");
        FileChannel fc = ram.getChannel();
        out.writeContainer(fc);
        ram.close();
        fc.close();
        Toast.makeText(getApplicationContext(), "success", Toast.LENGTH_SHORT).show();
    }
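A duration of 00:00 alongside a correct file size usually points at the movie-header (mvhd) duration field rather than at the sample data itself: per the MP4 spec, that field is the longest track duration re-expressed in the movie timescale. A standalone sketch of that arithmetic (all names and numbers are illustrative, not mp4parser API):

```java
public class MovieDuration {
    // Conceptual mvhd duration: convert each track's duration from its own
    // timescale to the movie timescale and take the maximum.
    static long mvhdDuration(long[] trackDurations, long[] trackTimescales, long movieTimescale) {
        long max = 0;
        for (int i = 0; i < trackDurations.length; i++) {
            long d = trackDurations[i] * movieTimescale / trackTimescales[i];
            if (d > max) {
                max = d;
            }
        }
        return max;
    }

    public static void main(String[] args) {
        // 10 s of video at timescale 90000, 10.5 s of audio at 44100,
        // movie timescale 1000 -> 10500 movie-timescale units
        System.out.println(mvhdDuration(new long[]{900000, 463050}, new long[]{90000, 44100}, 1000));
    }
}
```

If a player shows 00:00 despite valid tracks, inspecting the written mvhd box with an MP4 inspector is a quick way to confirm whether this field is the culprit.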

Source: (StackOverflow)