iOS Swift Convert MP3 to AAC

iOS Swift convert MP3 to AAC

Update

You're creating a .caf file instead of an .m4a.

Replace AVFileTypeCoreAudioFormat with AVFileTypeAppleM4A in

AVAssetWriter(URL: self.outputURL, fileType: AVFileTypeCoreAudioFormat)

Call self.assetWriter.finishWritingWithCompletionHandler() when you've finished.
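For reference, here is a minimal sketch of the corrected writer setup in the same Swift 2-era syntax used above. The AAC output settings (sample rate, channels, bit rate) are assumptions you should adjust to your input:

import AVFoundation
import AudioToolbox

// Sketch only: write AAC into an .m4a container instead of a .caf file.
func makeAACWriter(outputURL: NSURL) throws -> (AVAssetWriter, AVAssetWriterInput) {
    let writer = try AVAssetWriter(URL: outputURL, fileType: AVFileTypeAppleM4A)

    // Assumed AAC output settings; tune to match your source audio
    let settings: [String: AnyObject] = [
        AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44100,
        AVNumberOfChannelsKey: 2,
        AVEncoderBitRateKey: 128000
    ]
    let input = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: settings)
    writer.addInput(input)
    return (writer, input)
}

// After appending the last sample buffer:
//   input.markAsFinished()
//   writer.finishWritingWithCompletionHandler { /* the .m4a is complete */ }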

iOS - Convert video's audio to AAC

You should be able to achieve this by configuring your AVAssetReaderOutput output settings:

NSDictionary *readerOutputSettings = @{ AVSampleRateKey: @44100,
                                        AVFormatIDKey: @(kAudioFormatLinearPCM) };

AVAssetReaderOutput *assetReaderOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoAudioTrack
                                                                                    outputSettings:readerOutputSettings];
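In a Swift project the equivalent reader configuration looks roughly like this (a sketch; videoAudioTrack is assumed to be the asset's audio track, and the decoded PCM would then be fed to an AVAssetWriterInput configured for AAC, as in the answer above):

import AVFoundation
import AudioToolbox

// Sketch: decode the audio track to linear PCM so it can be re-encoded to AAC on the writer side.
let readerOutputSettings: [String: AnyObject] = [
    AVSampleRateKey: 44100,
    AVFormatIDKey: NSNumber(unsignedInt: kAudioFormatLinearPCM)
]
let assetReaderOutput = AVAssetReaderTrackOutput(track: videoAudioTrack,
                                                 outputSettings: readerOutputSettings)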

iOS split stereo MP3 to mono AAC

We finally got this to work! Here is the final Swift code we are using to convert NSData to a CMSampleBuffer:

import Foundation
import CoreMedia
import AudioToolbox

func NSDataToSample(data: NSData) -> CMSampleBufferRef? {
    // Create an empty CMBlockBuffer big enough to hold the raw bytes
    var cmBlockBufferRef: CMBlockBufferRef?
    var status = CMBlockBufferCreateWithMemoryBlock(nil, nil, data.length, nil, nil, 0, data.length, 0, &cmBlockBufferRef)
    if status != 0 {
        return nil
    }

    // Copy the NSData bytes into the block buffer
    status = CMBlockBufferReplaceDataBytes(data.bytes, cmBlockBufferRef!, 0, data.length)
    if status != 0 {
        return nil
    }

    // Describe the data: 16-bit signed integer, packed, mono linear PCM at 44.1 kHz
    var audioFormat = AudioStreamBasicDescription()
    audioFormat.mSampleRate = 44100
    audioFormat.mFormatID = kAudioFormatLinearPCM
    audioFormat.mFormatFlags = 0xc // kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked
    audioFormat.mBytesPerPacket = 2
    audioFormat.mFramesPerPacket = 1
    audioFormat.mBytesPerFrame = 2
    audioFormat.mChannelsPerFrame = 1
    audioFormat.mBitsPerChannel = 16
    audioFormat.mReserved = 0

    var format: CMFormatDescriptionRef?
    status = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, &audioFormat, 0, nil, 0, nil, nil, &format)
    if status != 0 {
        return nil
    }

    // Each sample is 2 bytes (16-bit mono), so the sample count is data.length / 2
    var sampleBuffer: CMSampleBufferRef?
    status = CMSampleBufferCreate(kCFAllocatorDefault, cmBlockBufferRef!, true, nil, nil, format, data.length / 2, 0, nil, 0, nil, &sampleBuffer)
    if status != 0 {
        return nil
    }

    return sampleBuffer
}
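A quick usage sketch (the path is hypothetical; the NSData is assumed to hold raw 16-bit mono PCM at 44.1 kHz, matching the format description above):

import Foundation
import CoreMedia

// Hypothetical raw PCM file
let pcmData = NSData(contentsOfFile: "/path/to/audio.pcm")!
if let sampleBuffer = NSDataToSample(pcmData) {
    print("Created a sample buffer with \(CMSampleBufferGetNumSamples(sampleBuffer)) samples")
}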

How do I convert audio format from MP3 to WAV in iOS using Swift?

WAV and AIFF contain PCM, so look at Core Audio, specifically:

  1. Audio File Services (to read the MP3 format and write AIFF or WAV)
  2. Audio Converter Services (to convert the MP3 data to PCM, and/or to
     encode from PCM to some other codec if you were to write a file). Note
     that a given converter cannot convert between two encoded formats: you
     can do MP3-to-PCM or PCM-to-AAC, but to do MP3-to-AAC you'd need two
     converters.
  3. Extended Audio File Services, which combine both of the above.

Also, be sure to understand the difference between codecs and file formats, and which codec/format combinations are legal. I was surprised the first time I found out that PCM must be little-endian in a WAV and big-endian in an AIFF.
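For example, when you describe the PCM you want to read or write, the only difference between the WAV and AIFF cases is the endianness flag in the AudioStreamBasicDescription. A sketch, assuming 16-bit integer PCM at 44.1 kHz stereo:

import AudioToolbox

// 16-bit signed integer, packed linear PCM, 44.1 kHz, stereo
var pcm = AudioStreamBasicDescription()
pcm.mSampleRate       = 44100
pcm.mFormatID         = kAudioFormatLinearPCM
pcm.mChannelsPerFrame = 2
pcm.mBitsPerChannel   = 16
pcm.mBytesPerFrame    = 2 * pcm.mChannelsPerFrame
pcm.mBytesPerPacket   = pcm.mBytesPerFrame
pcm.mFramesPerPacket  = 1

// WAV expects little-endian samples (the default with no endian flag)...
pcm.mFormatFlags = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked
// ...while AIFF expects big-endian; add the flag for AIFF output:
// pcm.mFormatFlags |= kLinearPCMFormatFlagIsBigEndian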

Also, look at the nice example taken from AudioKit v4.

iOS Code to Convert M4A to WAV

If anyone else needs some code to do this, here it is in Swift:

import AVFoundation
import AudioToolbox

func convertAudioFile(sourceURL: CFURLRef, destinationURL: CFURLRef,
                      outputFormat: OSType, outputSampleRate: Float64) -> OSStatus {
    var error: OSStatus = noErr
    var destinationFile: ExtAudioFileRef = nil
    var sourceFile: ExtAudioFileRef = nil

    var srcFormat = AudioStreamBasicDescription()
    var dstFormat = AudioStreamBasicDescription()

    var audioConverter: AudioConverterRef = nil

    // Open the source file and read its data format
    ExtAudioFileOpenURL(sourceURL, &sourceFile)

    var thePropertySize = UInt32(strideofValue(srcFormat))
    ExtAudioFileGetProperty(sourceFile, kExtAudioFileProperty_FileDataFormat, &thePropertySize, &srcFormat)

    // Destination format: mono 16-bit packed signed-integer PCM (little-endian);
    // the fields below assume outputFormat == kAudioFormatLinearPCM
    dstFormat.mSampleRate = (outputSampleRate == 0 ? srcFormat.mSampleRate : outputSampleRate) // keep the source rate if 0 is passed
    dstFormat.mFormatID = outputFormat
    dstFormat.mChannelsPerFrame = 1
    dstFormat.mBitsPerChannel = 16
    dstFormat.mBytesPerPacket = 2 * dstFormat.mChannelsPerFrame
    dstFormat.mBytesPerFrame = 2 * dstFormat.mChannelsPerFrame
    dstFormat.mFramesPerPacket = 1
    dstFormat.mFormatFlags = kLinearPCMFormatFlagIsPacked | kLinearPCMFormatFlagIsSignedInteger

    // Create the destination file as a WAV container, overwriting any existing file
    ExtAudioFileCreateWithURL(destinationURL, kAudioFileWAVEType, &dstFormat, nil,
                              AudioFileFlags.EraseFile.rawValue, &destinationFile)

    // Both files use the destination format on the client side, so the
    // ExtAudioFile machinery handles the decode/convert for us
    ExtAudioFileSetProperty(sourceFile, kExtAudioFileProperty_ClientDataFormat, thePropertySize, &dstFormat)
    ExtAudioFileSetProperty(destinationFile, kExtAudioFileProperty_ClientDataFormat, thePropertySize, &dstFormat)

    // Check whether the underlying converter can resume after an interruption
    var size = UInt32(strideofValue(audioConverter))
    ExtAudioFileGetProperty(destinationFile, kExtAudioFileProperty_AudioConverter, &size, &audioConverter)

    var canResume: UInt32 = 0
    size = UInt32(strideofValue(canResume))
    error = AudioConverterGetProperty(audioConverter, kAudioConverterPropertyCanResumeFromInterruption, &size, &canResume)

    let bufferByteSize: UInt32 = 32768
    var srcBuffer = [UInt8](count: 32768, repeatedValue: 0)
    var sourceFrameOffset: UInt32 = 0

    print("Converting audio file")

    // Pull converted frames from the source and push them to the destination
    while true {
        var fillBufList = AudioBufferList(
            mNumberBuffers: 1,
            mBuffers: AudioBuffer(
                mNumberChannels: dstFormat.mChannelsPerFrame, // match the client (destination) channel count
                mDataByteSize: UInt32(srcBuffer.count),
                mData: &srcBuffer
            )
        )

        var numFrames: UInt32 = 0
        if dstFormat.mBytesPerFrame > 0 {
            numFrames = bufferByteSize / dstFormat.mBytesPerFrame
        }

        ExtAudioFileRead(sourceFile, &numFrames, &fillBufList)

        if numFrames == 0 {
            // Reached the end of the source file
            error = noErr
            break
        }

        sourceFrameOffset += numFrames
        error = ExtAudioFileWrite(destinationFile, numFrames, &fillBufList)
    }

    ExtAudioFileDispose(destinationFile)
    ExtAudioFileDispose(sourceFile)

    // Sanity check: treat output shorter than 5 seconds as a failure
    let audioAsset = AVURLAsset(URL: destinationURL as NSURL, options: nil)
    if audioAsset.duration.seconds < 5.0 {
        error = -2500
    }

    return error
}
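A usage sketch with hypothetical paths (passing 0 for outputSampleRate keeps the source file's sample rate):

import Foundation
import AudioToolbox

// Hypothetical input/output paths
let source = NSURL(fileURLWithPath: "/path/to/input.m4a") as CFURL
let destination = NSURL(fileURLWithPath: "/path/to/output.wav") as CFURL

let status = convertAudioFile(source, destinationURL: destination,
                              outputFormat: kAudioFormatLinearPCM,
                              outputSampleRate: 0)
print("conversion finished with status \(status)")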

