EzDevInfo.com

finch

An idiomatic Scala version of finagle-http. (Note: this tag description refers to a different project; the questions collected below actually concern the Finch OpenAL audio engine for iOS and the Finch robot's Java API, not the Scala library.)

Is the Finch audio library for iPhone capable of doing this?

I need to:

- start / stop sounds with lengths between 0.1 and 10 seconds
- change the playback volume

It would also be nice to be able to:

- change the playback speed
- change the playback pitch / frequency
- pause a sound and resume playing it later
- play a sound backwards

Is Finch my best friend here?
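
A minimal sketch (not from the original post) of how a sound is loaded and played through Finch's FIFactory/FISoundEngine API, mirroring the calls that appear in the crash question further down this page. The file name is a placeholder, and the gain and pitch properties are assumptions to verify against FISound.h in your Finch version:

#import "FIFactory.h"
#import "FISoundEngine.h"
#import "FISound.h"

- (void)setUpAudio
{
    FIFactory *factory = [[FIFactory alloc] init];
    FISoundEngine *engine = [factory buildSoundEngine];
    [engine openAudioDevice];

    // "click.caf" is a placeholder; maxPolyphony reserves voices for overlap.
    FISound *click = [factory loadSoundNamed:@"click.caf" maxPolyphony:4 error:NULL];

    click.gain = 0.5f;    // assumption: playback volume property
    click.pitch = 1.5f;   // assumption: OpenAL-style pitch (also changes speed)
    [click play];
}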


Source: (StackOverflow)

When I use Finch to play audio, can I actually do everything that OpenAL can do?



Source: (StackOverflow)

How can I do metering/average peak power level in OpenAL?

I'm in the process of switching from AVAudioPlayer to OpenAL using the Finch sound engine. I need to do metering, i.e. get the average peak levels. Finch sound engine does not provide this, and I'm completely new to OpenAL. How can I do this? Any examples would be really appreciated.
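
OpenAL has no metering API of its own, so one common approach (a sketch, not from the original post) is to compute the levels yourself from the raw PCM samples you feed to the buffers. This assumes 16-bit mono PCM; the function name is made up for illustration:

#include <math.h>
#include <stdint.h>
#include <stddef.h>

// Compute average (RMS) and peak power in decibels for a block of
// 16-bit PCM samples. Hypothetical helper, not part of Finch or OpenAL.
static void MeterPCM16(const int16_t *samples, size_t count,
                       float *averagePowerDB, float *peakPowerDB)
{
    double sumOfSquares = 0.0, peak = 0.0;
    for (size_t i = 0; i < count; i++) {
        double x = samples[i] / 32768.0;       // normalize to -1.0 ... 1.0
        sumOfSquares += x * x;
        if (fabs(x) > peak) peak = fabs(x);
    }
    double rms = count ? sqrt(sumOfSquares / count) : 0.0;
    *averagePowerDB = (float)(20.0 * log10(rms  > 0 ? rms  : 1e-9));
    *peakPowerDB    = (float)(20.0 * log10(peak > 0 ? peak : 1e-9));
}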


Source: (StackOverflow)

Can Finch play sounds from in-memory NSData?

Can the Finch library play sounds from NSData sources so that things recorded within the app using AVAudioRecorder can then be played back?


Source: (StackOverflow)

How do I stop all playing sounds in Finch?

My question is: is there a way I can stop all sounds being played at the same time with one statement? For example, if there are 20 sounds playing, how do I stop them without having to write 20 statements?
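
Not Finch-specific, but one approach (a sketch, not from the original post) is to keep every sound you create in a single collection and loop over it. The -stop call below is an assumption; check FISound.h in your Finch version for the actual method name and adapt if it differs:

// Keep references to every sound you create, e.g. in an ivar or property:
NSMutableArray *activeSounds = [NSMutableArray array];
// ... [activeSounds addObject:someSound]; for each FISound you load or play ...

// Then one loop stops them all, however many there are:
for (FISound *sound in activeSounds) {
    [sound stop];   // assumption: a stop method exists on the sound class
}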


Source: (StackOverflow)

Finished playing audio callback with Finch

Is there a finished-playing callback in Finch, similar to audioPlayerDidFinishPlaying in AVAudioPlayer? Looking through the code, I could not find anything that referenced it.
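
There doesn't appear to be a documented callback, so one workaround (a sketch, not from the original post) is to schedule your own completion block after the sound's length. The duration value here is an assumption; use the known length of your asset, or a duration property if your FISound version exposes one:

// Hypothetical helper, not a Finch API.
- (void)playSound:(FISound *)sound completion:(void (^)(void))completion
{
    [sound play];

    NSTimeInterval length = 1.0;   // assumption: replace with the sound's real length
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(length * NSEC_PER_SEC)),
                   dispatch_get_main_queue(), ^{
        if (completion) completion();   // stands in for "did finish playing"
    });
}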


Source: (StackOverflow)

Can I change the playback speed without affecting the pitch with Finch?

Is it possible to change the playback speed of a sound in Finch, but without changing the pitch?


Source: (StackOverflow)

Use Finch in the application delegate?

I am finalizing an iPad game, and Finch seems really nice for handling game sounds. My question is: is it possible to instantiate all the Finch machinery in the application delegate and use it from the multiple game screens? I mean, prepare the Finch sounds in the delegate and launch them from the game screens?
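
One common pattern (a sketch, not from the original post) is to keep the engine and the sounds in the app delegate and reach them from any screen, using the same factory and engine calls that appear in the crash question further down this page. Class and property names here are placeholders:

// AppDelegate.h (sketch; property names are placeholders)
#import <UIKit/UIKit.h>
#import "FIFactory.h"
#import "FISoundEngine.h"
#import "FISound.h"

@interface AppDelegate : UIResponder <UIApplicationDelegate>
@property (strong, nonatomic) UIWindow *window;
@property (strong, nonatomic) FISoundEngine *soundEngine;
@property (strong, nonatomic) FISound *explosionSound;
@end

// AppDelegate.m (sketch): set everything up once at launch.
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    FIFactory *factory = [[FIFactory alloc] init];
    self.soundEngine = [factory buildSoundEngine];
    [self.soundEngine openAudioDevice];
    self.explosionSound = [factory loadSoundNamed:@"explosion.caf" maxPolyphony:4 error:NULL];
    return YES;
}

// Any game screen can then reach the delegate and play the shared sounds:
// AppDelegate *app = (AppDelegate *)[[UIApplication sharedApplication] delegate];
// [app.explosionSound play];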


Source: (StackOverflow)

If I want to play the same sound 10 times per second, must I have 10 copies of that sound in memory?

I have a sound that needs to be played 10 times per second. The sound is 1 second long, so it overlaps with itself about 10 times. However, as far as I understand the Finch sound library, I would need 10 different instances of the sound in place so that I can play it 10 times at almost the same time.

When I have just one instance, the sound would stop and play from the beginning on every iteration, but not overlap with itself.

How can I do that?
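
Not from the original post, but this looks like what the maxPolyphony: parameter of loadSoundNamed:maxPolyphony:error: (used in the crash question further down this page) is for: one loaded sound gets several voices, so overlapping play calls don't restart each other. A rough sketch; the file name, property, and timer interval are placeholders:

// In your setup code (factory and engine created as in the crash question below):
self.tick = [factory loadSoundNamed:@"tick.caf" maxPolyphony:10 error:NULL];

[NSTimer scheduledTimerWithTimeInterval:0.1
                                 target:self
                               selector:@selector(playTick)
                               userInfo:nil
                                repeats:YES];

// Elsewhere in the same class:
- (void)playTick
{
    // Each call should grab a free voice, so the plays overlap
    // instead of cutting one another off.
    [self.tick play];
}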


Source: (StackOverflow)

Fading out sounds when using Finch in iOS

I am using the Finch OpenAL wrapper in iOS and would like to fade out my FISounds.

Suppose I have a 30-second sound; I would like, for example, to be able to fade it out over 5 seconds, starting after 15 seconds.

I'd like to avoid dropping down to OpenAL for this if possible.
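
One way to stay above OpenAL (a sketch, not from the original post) is to ramp the sound's volume down with a timer. This assumes FISound exposes a settable gain property, which is an assumption to check against FISound.h; to start the fade at the 15-second mark, the call could be wrapped in a dispatch_after:

// Fade `sound` from its current volume to silence over `duration` seconds.
- (void)fadeOutSound:(FISound *)sound over:(NSTimeInterval)duration
{
    const NSTimeInterval step = 0.05;              // 20 volume updates per second
    __block float gain = sound.gain;               // assumption: gain property
    float decrement = gain * (float)(step / duration);

    [NSTimer scheduledTimerWithTimeInterval:step repeats:YES block:^(NSTimer *timer) {
        gain -= decrement;
        if (gain <= 0.0f) {
            sound.gain = 0.0f;
            [timer invalidate];                    // fade finished
        } else {
            sound.gain = gain;
        }
    }];
}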


Source: (StackOverflow)

Finch Audio Engine not opening default OpenAL device

I have an abstract class that initializes Finch to the global variable Finch *engine in the awakeFromNib method as follows:

engine = [[Finch alloc] init];

None of the abstract class's subclasses override the method. However, whenever I run my program, Finch prints "Finch: Could not open default OpenAL device." in the debugger. Why can't Finch get the default OpenAL device? As far as I can tell, I'm doing everything as shown in their code example.


Source: (StackOverflow)

Why doesn't Finch share a single Buffer for its polyphonic sounds?

I am doing some research and experimenting with OpenAL. Specifically, I am interested in techniques for polyphony, that is, playing a single sound multiple times concurrently. I came across Finch, which has a feature to support this. In my own code I had created a single OpenAL Buffer per audio file and then initialized multiple OpenAL Sources with that Buffer. Finch, on the other hand, creates an OpenAL Buffer per OpenAL Source. Is there any performance-related or functional reason for this?
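
For reference (not from the original post), the plain OpenAL pattern described above, one shared buffer attached to several sources, looks roughly like this; pcmData, pcmSize and sampleRate are placeholders for your decoded audio:

#import <OpenAL/al.h>

// One buffer holds the decoded PCM data once...
ALuint buffer;
alGenBuffers(1, &buffer);
alBufferData(buffer, AL_FORMAT_MONO16, pcmData, pcmSize, sampleRate);

// ...and several sources share it, so the same sample can play concurrently.
enum { kVoices = 4 };
ALuint sources[kVoices];
alGenSources(kVoices, sources);
for (int i = 0; i < kVoices; i++) {
    alSourcei(sources[i], AL_BUFFER, buffer);
}

// Each source is an independent voice.
alSourcePlay(sources[0]);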


Source: (StackOverflow)

Finch Sound Engine Memory Leak

I want to access some sounds from different classes, so that the pitch of each sound can be read and changed in multiple classes. So I declare extern FISound *mySound in my application delegate and load the sounds in my view controller. It is working like a charm, but the problem is that there is always a memory leak for [FIDecoder decodeSampleAtPath:error:]:

Leaked Object    #   Address     Size       Responsible Library   Responsible Frame
FISample         1   0x76e9030   32 Bytes   Musizs                -[FIDecoder decodeSampleAtPath:error:]
NSConcreteData   1   0x76e7100   32 Bytes   Foundation            +[NSData(NSData) allocWithZone:]
NSConcreteData   1   0x737b080   32 Bytes   Foundation            +[NSData(NSData) allocWithZone:]
FISample         1   0x76e81c0   32 Bytes   Musizs                -[FIDecoder decodeSampleAtPath:error:]

Does somebody have a problem like that? Thank you.


Source: (StackOverflow)

Why is my app crashing when I try to play a sound using the Finch library for iPhone SDK?

I followed the instructions in the read-me file exactly, but for some reason, every time I hit the UIButton corresponding to the code that plays the sound ([soundA play];), the app just crashes without any detailed error description, except for dropping into lldb. I'm using Finch because it plays audio using OpenAL, and I need to use OpenAL for the type of app I'm making, because AVAudioPlayer and System Sounds are not usable for what I'm making. Here is the code that I am using.

Main file:

#import "ViewController.h"

@interface ViewController ()

@end

@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    soundFactory = [[FIFactory alloc] init];
    engine = [soundFactory buildSoundEngine];
    [engine activateAudioSessionWithCategory:AVAudioSessionCategoryPlayback];
    [engine openAudioDevice];
    soundA = [soundFactory loadSoundNamed:@"1.caf" maxPolyphony:16 error:NULL];
    soundB = [soundFactory loadSoundNamed:@"2.caf" maxPolyphony:16 error:NULL];
    soundC = [soundFactory loadSoundNamed:@"3.caf" maxPolyphony:16 error:NULL];
    soundD = [soundFactory loadSoundNamed:@"4.caf" maxPolyphony:16 error:NULL];
}

- (void)viewDidUnload
{
    [super viewDidUnload];
    // Release any retained subviews of the main view.
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone) {
        return (interfaceOrientation != UIInterfaceOrientationPortraitUpsideDown);
    } else {
        return YES;
    }
}

- (IBAction)PlaySoundA { [soundA play]; }
- (IBAction)PlaySoundB { [soundB play]; }
- (IBAction)PlaySoundC { [soundC play]; }
- (IBAction)PlaySoundD { [soundD play]; }

@end

Header file:

#import <UIKit/UIKit.h>
#import "FISoundEngine.h"
#import "FIFactory.h"
#import "FISound.h"

@interface ViewController : UIViewController {
    FIFactory *soundFactory;
    FISoundEngine *engine;
    FISound *soundA;
    FISound *soundB;
    FISound *soundC;
    FISound *soundD;
}

@end

Any help would be appreciated! Thanks!


Source: (StackOverflow)

Java - Adding timed closure

I am trying to make my code quit if there is no light reaching any of the light sensors.

import edu.cmu.ri.createlab.terk.robot.finch.Finch;


public class RunProgram {

    public static Finch LeFinch = new Finch();

    public static boolean endProgram = false;

    private static long WaitingTime = System.currentTimeMillis();

    public static void main(String args[])
    {
        LightSensors lightsensor = new LightSensors();

        // do {

        while (ObjectSensor.Obstacle() == false || WaitingTime < 5000)
        {
            if (lightsensor.leftsensor() == true && lightsensor.rightsensor() == true)
            {
                Movement.forward();
            }
            else if (lightsensor.leftsensor() == true && lightsensor.rightsensor() == false)
            {
                Movement.left();
                System.out.println("LEFT");
            }
            else if (lightsensor.leftsensor() == false && lightsensor.rightsensor() == true)
            {
                Movement.right();
                System.out.println("RIGHT");
            }
            else if (lightsensor.leftsensor() == false && lightsensor.rightsensor() == false)
            {
                Movement.stop();
            }
        }
        System.out.println("Object Detected");

        // } while (endProgram == false);
    }
}

I have tried using System.currentTimeMillis and creating a while loop that will stop running once it's over 5000 milliseconds, but this does not seem to work.

This uses the Finch API.

I have updated the code; I have decided to use a counter that terminates the application once it reaches 5000+.

However, this value is not resetting once a light has been shined onto the Finch.

static long counterTime = 0;

while (counterTime < 5000)
{
    if (lightsensor.leftsensor() == true && lightsensor.rightsensor() == true)
    {
        Movement.forward();
        counterTime = 0;
    }
    else if (lightsensor.leftsensor() == true && lightsensor.rightsensor() == false)
    {
        Movement.left();
        System.out.println("LEFT");
        counterTime = 0;
    }
    else if (lightsensor.leftsensor() == false && lightsensor.rightsensor() == true)
    {
        Movement.right();
        System.out.println("RIGHT");
        counterTime = 0;
    }
    else
    {
        Movement.stop();
        counterTime = System.currentTimeMillis() - startTime;
        System.out.println(counterTime);
    }
}
endProgram = true;

Source: (StackOverflow)