Saving Audio After Effect in iOS

Here is my solution to the question:

func playAndRecord(pitch: Float, rate: Float, reverb: Float, echo: Float) {
    // audioEngine, audioPlayerNode, playerB, player, audioFile, newAudio and
    // filteredOutputURL are instance properties here; declare them locally
    // (let audioEngine = AVAudioEngine(), etc.) if you prefer.
    audioEngine = AVAudioEngine()
    audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attachNode(audioPlayerNode)
    playerB = AVAudioPlayerNode()
    audioEngine.attachNode(playerB)

    // Set the pitch
    let pitchEffect = AVAudioUnitTimePitch()
    pitchEffect.pitch = pitch
    audioEngine.attachNode(pitchEffect)

    // Set the playback rate
    let playbackRateEffect = AVAudioUnitVarispeed()
    playbackRateEffect.rate = rate
    audioEngine.attachNode(playbackRateEffect)

    // Set the reverb effect
    let reverbEffect = AVAudioUnitReverb()
    reverbEffect.loadFactoryPreset(AVAudioUnitReverbPreset.Cathedral)
    reverbEffect.wetDryMix = reverb
    audioEngine.attachNode(reverbEffect)

    // Set the echo effect with a specific delay interval
    let echoEffect = AVAudioUnitDelay()
    echoEffect.delayTime = NSTimeInterval(echo)
    audioEngine.attachNode(echoEffect)

    // Chain the nodes together, ending at the output
    audioEngine.connect(audioPlayerNode, to: playbackRateEffect, format: nil)
    audioEngine.connect(playbackRateEffect, to: pitchEffect, format: nil)
    audioEngine.connect(pitchEffect, to: reverbEffect, format: nil)
    audioEngine.connect(reverbEffect, to: echoEffect, format: nil)
    audioEngine.connect(echoEffect, to: audioEngine.mainMixerNode, format: nil)

    // Good practice to stop before starting
    audioPlayerNode.stop()

    // Stop the plain AVAudioPlayer (also a property) if it is already playing
    if player != nil {
        player?.stop()
    }

    // audioFile is the original, unprocessed audio
    audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: {
        print("Complete")
    })

    try! audioEngine.start()

    let dirPath = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as NSString
    let tmpFileUrl = NSURL.fileURLWithPath(dirPath.stringByAppendingPathComponent("effectedSound2.m4a"))

    // Keep tmpFileUrl in a property so the URL isn't lost
    // (not important if you do something else with it right away)
    filteredOutputURL = tmpFileUrl

    do {
        self.newAudio = try AVAudioFile(forWriting: tmpFileUrl, settings: [
            AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatAppleLossless),
            AVEncoderAudioQualityKey: AVAudioQuality.Low.rawValue,
            AVEncoderBitRateKey: 320000,
            AVNumberOfChannelsKey: 2,
            AVSampleRateKey: 44100.0
        ])

        let length = self.audioFile.length

        audioEngine.mainMixerNode.installTapOnBus(0, bufferSize: 1024, format: self.audioEngine.mainMixerNode.inputFormatForBus(0)) {
            (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in

            // Compare lengths so we know when to stop writing;
            // otherwise the tap would keep saving indefinitely
            if self.newAudio.length < length {
                do {
                    try self.newAudio.writeFromBuffer(buffer)
                } catch {
                    print("Problem Writing Buffer")
                }
            } else {
                // If the tap is not removed, it keeps firing indefinitely
                self.audioEngine.mainMixerNode.removeTapOnBus(0)

                // Do whatever you want here with the processed audio
            }
        }
    } catch {
        print("Problem")
    }

    audioPlayerNode.play()
}

Save the audio file in the background

Actually, we made a mistake in the settings of the output audio file. The output file's processing format should be the same as the input file's (the file you apply the effect or pitch to).

Also, the output file should be in WAV or CAF format; only with these formats is the processed audio saved to the output file correctly.
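As a sketch in the Swift 2 style used above, linear-PCM output settings along these lines would work (the sample rate and channel count here are assumptions; match them to your input file's `processingFormat`):

```swift
// Assumption: 44.1 kHz stereo input. Adjust to match
// audioFile.processingFormat for your own file.
let recordSettings: [String: AnyObject] = [
    AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatLinearPCM), // uncompressed PCM
    AVSampleRateKey: 44100.0,
    AVNumberOfChannelsKey: 2,
    AVLinearPCMBitDepthKey: 16,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsFloatKey: false
]
// Then write to a .caf (or .wav) URL:
// let outputFile = try AVAudioFile(forWriting: outputURL, settings: recordSettings)
```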

- (IBAction)save_it_after_changes:(id)sender
{
    engine = [[AVAudioEngine alloc] init];
    audio_player_node = [[AVAudioPlayerNode alloc] init];
    [engine attachNode:audio_player_node];
    [self setupEQ];

    AVAudioMixerNode *mixerNode = [engine mainMixerNode];
    [engine connect:audio_player_node to:unitEq format:audioFile.processingFormat];
    [engine connect:unitEq to:mixerNode format:audioFile.processingFormat];

    NSError *error12;
    [engine startAndReturnError:&error12];
    if (!error12)
    {
        NSLog(@"Engine = %@", engine);
        [audio_player_node scheduleFile:audioFile atTime:nil completionHandler:nil];

        NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
        [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
        [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
        [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
        [recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
        [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
        [recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

        NSError *error;
        outputFile = [[AVAudioFile alloc] initForWriting:[self testFilePathURL] settings:recordSetting error:&error];
        NSLog(@"outputFile = %@", outputFile);
        if (error)
        {
            NSLog(@"outputFile error = %@", error);
        }
        else
        {
            [audio_player_node installTapOnBus:0 bufferSize:8192 format:audioFile.processingFormat block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
                NSLog(@"buffer = %@", buffer);
                NSLog(@"when = %lld", when.sampleTime);
                NSLog(@"output file length = %lld", outputFile.length);
                NSLog(@"input file length = %lld", audioFile.length);
                if (outputFile.length < audioFile.length)
                {
                    NSError *writeError;
                    [outputFile writeFromBuffer:buffer error:&writeError];
                    if (writeError)
                    {
                        NSLog(@"write buffer error = %@", writeError);
                    }
                }
                else
                {
                    // The whole input has been written; stop tapping.
                    [audio_player_node removeTapOnBus:0];
                    NSLog(@"Saved to %@", [self testFilePathURL]);
                    // e.g. play the saved file with an AVAudioPlayer here,
                    // or convert it to another format.
                }
            }];
        }
    }
    else
    {
        NSLog(@"error12 = %@", error12);
    }
}

- (void)setupEQ
{
    NSLog(@"setupEQ");

    unitEq = [[AVAudioUnitEQ alloc] initWithNumberOfBands:12];
    unitEq.globalGain = 3.0;

    // Bands 0-9: parametric bands on the standard octave centers, each cut by 20 dB.
    NSArray *centerFrequencies = @[@31, @63, @125, @250, @500, @1000, @2000, @4000, @8000, @16000];
    for (NSUInteger i = 0; i < centerFrequencies.count; i++)
    {
        AVAudioUnitEQFilterParameters *filterParameters = unitEq.bands[i];
        filterParameters.filterType = AVAudioUnitEQFilterTypeParametric;
        filterParameters.frequency = [centerFrequencies[i] floatValue];
        filterParameters.bandwidth = 1.0;
        filterParameters.gain = -20;
        filterParameters.bypass = FALSE;
    }

    // Band 10: low-pass
    AVAudioUnitEQFilterParameters *lowPass = unitEq.bands[10];
    lowPass.filterType = AVAudioUnitEQFilterTypeLowPass;
    lowPass.frequency = 16857;
    lowPass.bypass = FALSE;

    // Band 11: high-pass
    AVAudioUnitEQFilterParameters *highPass = unitEq.bands[11];
    highPass.filterType = AVAudioUnitEQFilterTypeHighPass;
    highPass.frequency = 205;
    highPass.bypass = FALSE;

    [engine attachNode:unitEq];
}

- (NSString *)applicationDocumentsDirectory
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    return ([paths count] > 0) ? [paths objectAtIndex:0] : nil;
}

- (NSURL *)testFilePathURL
{
    return [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/test.caf",
                                   [self applicationDocumentsDirectory]]];
}

Play the file after it has been saved successfully. This works for me; check it out.

For more detail, see the question this answer draws on: Can I use AVAudioEngine to read from a file, process with an audio unit and write to a file, faster than real-time?

Also refer to this sample project; it does exactly what we are looking for:
https://github.com/VladimirKravchenko/AVAudioEngineOfflineRender
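For what it's worth, since iOS 11 AVAudioEngine has built-in offline (manual) rendering, which avoids real-time taps entirely. A minimal sketch in current Swift syntax, assuming an `engine` whose `player` node is connected to the main mixer, an `audioFile` already scheduled on the player, and an `outputURL` (all names from the surrounding context, not from the linked project):

```swift
import AVFoundation

// Switch the engine to offline rendering before starting it.
try engine.enableManualRenderingMode(.offline,
                                     format: audioFile.processingFormat,
                                     maximumFrameCount: 4096)
try engine.start()
player.play()

let outputFile = try AVAudioFile(forWriting: outputURL,
                                 settings: audioFile.fileFormat.settings)
let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                              frameCapacity: engine.manualRenderingMaximumFrameCount)!

// Pull rendered audio out of the engine as fast as it can be produced.
while engine.manualRenderingSampleTime < audioFile.length {
    let framesLeft = AVAudioFrameCount(audioFile.length - engine.manualRenderingSampleTime)
    let status = try engine.renderOffline(min(buffer.frameCapacity, framesLeft), to: buffer)
    if status == .success {
        try outputFile.write(from: buffer)
    }
}
engine.stop()
```

Because the render loop runs faster than real time and the file length bounds the loop, this also sidesteps the "when do I remove the tap?" problem from the answers above.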

Record audio with added effects

You need to flush and close the audio file so that the .caf file is written out properly.

Since AVAudioFile doesn't have explicit methods for doing that, your only hope appears to be setting newAudio to nil after you've finished and hoping that the flush and close happen during AVAudioFile's dealloc:

self.audioEngine.mainMixerNode.removeTapOnBus(0)
print("finish?")
self.newAudio = nil // hopefully flush & close, if there are no other strong references

Record audio, add effects, then save result to a audio file

Using FMOD_OUTPUTTYPE_WAVWRITER is fairly straightforward: set the type via System::setOutput, and specify the output file via the extradriverdata parameter of System::init. The extradriverdata should be an absolute path to a writable area of the device, such as the documents directory. After you have finished playing, call System::release and the file will be complete.

The other option for recording wave data with effects is to create a custom DSP and connect it to the channel playing the recorded data. You will then get regular callbacks giving you float data that you must write out to disk yourself. You can find examples of custom DSPs and of writing WAV files in the dsp_custom and recordtodisk examples, respectively.

Finally, note that FMOD doesn't come with the facility to write compressed audio to disk; you will need another API to achieve that.

Saving Recorded Audio (Swift)

I think the file is probably still there when the app quits. The problem is that your viewDidLoad() method immediately calls setupRecorder(), which in turn creates a new AVAudioRecorder using exactly the same filename as last time, overwriting your work.

To help you re-arrange your code, go to audioRecorderDidFinishRecording() and change print(recordedAudio.title) to print(recorder.url). If you're running in the iOS Simulator that will give you a long path to an exact filename on your OS X disk drive.

If you browse there using Finder you'll be able to see your "audioFile.m4a" file being created and overwritten again and again, which will let you see exactly when your problem occurs. If you want to see the exact problem, set a breakpoint in your code when you call prepareToPlay(), check your file's size, then press F6 to execute that line of code, then check your file's size again – you should see it being cleared :)

The solution is probably to generate a new filename every time. You could use NSUUID for that if you wanted:

let filename = NSUUID().UUIDString + ".m4a"

You might find my tutorial on AVAudioRecorder useful.

Note: your getCacheDirectory() method is poorly named. It's fetching the documents directory, not the caches directory, which is a bit of a red herring when trying to debug issues like this.
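A minimal sketch of the unique-filename approach, in the same Swift 2 era syntax as the snippet above (the function name is illustrative, not from the original code):

```swift
func newRecordingURL() -> NSURL {
    // A fresh UUID per recording means setupRecorder() can never
    // clobber an earlier take on the next launch.
    let filename = NSUUID().UUIDString + ".m4a"
    let documents = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)[0] as NSString
    return NSURL.fileURLWithPath(documents.stringByAppendingPathComponent(filename))
}
```

Pass the returned URL to AVAudioRecorder each time you set up a new recording; keep the URL (or filename) somewhere persistent if you need to find the take again later.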


