iOS Swift: convert MP3 to AAC

Problem description

It works when I generate a PCM file. When I change the export format to m4a it generates a file but it won't play. Why is it corrupt?

Here is the code so far:

import AVFoundation
import UIKit

class ViewController: UIViewController {

    var rwAudioSerializationQueue:dispatch_queue_t!

    var asset:AVAsset!

    var assetReader:AVAssetReader!

    var assetReaderAudioOutput:AVAssetReaderTrackOutput!

    var assetWriter:AVAssetWriter!

    var assetWriterAudioInput:AVAssetWriterInput!

    var outputURL:NSURL!

    override func viewDidLoad() {
        super.viewDidLoad()

        let rwAudioSerializationQueueDescription = String(self) + " rw audio serialization queue"

        // Create the serialization queue to use for reading and writing the audio data.
        self.rwAudioSerializationQueue = dispatch_queue_create(rwAudioSerializationQueueDescription, nil)

        let paths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
        let documentsPath = paths[0]

        print(NSBundle.mainBundle().pathForResource("input", ofType: "mp3"))

        self.asset = AVAsset(URL: NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("input", ofType: "mp3")! ))

        self.outputURL = NSURL(fileURLWithPath: documentsPath + "/output.m4a")

        print(self.outputURL)

      //  [self.asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{

        self.asset.loadValuesAsynchronouslyForKeys(["tracks"], completionHandler: {

            print("loaded")

            var success = true
            var localError:NSError?

            success = (self.asset.statusOfValueForKey("tracks", error: &localError) == AVKeyValueStatus.Loaded)

            // Check for success of loading the assets tracks.
            //success = ([self.asset statusOfValueForKey:@"tracks" error:&localError] == AVKeyValueStatusLoaded);
            if (success)
            {
                // If the tracks loaded successfully, make sure that no file exists at the output path for the asset writer.

                let fm = NSFileManager.defaultManager()
                let localOutputPath = self.outputURL.path
                if (fm.fileExistsAtPath(localOutputPath!)) {
                    do {
                        try fm.removeItemAtPath(localOutputPath!)
                        success = true
                    } catch {

                    }
                }
            }
            if (success) {
                success = self.setupAssetReaderAndAssetWriter()
            }
            if (success) {
                success = self.startAssetReaderAndWriter()
            }
        })
    }

    func setupAssetReaderAndAssetWriter() -> Bool {

        do {
            try self.assetReader = AVAssetReader(asset: self.asset)
        } catch {

        }

        do {
            try self.assetWriter = AVAssetWriter(URL: self.outputURL, fileType: AVFileTypeCoreAudioFormat)
        } catch {

        }

        var assetAudioTrack:AVAssetTrack? = nil
        let audioTracks = self.asset.tracksWithMediaType(AVMediaTypeAudio)

        if (audioTracks.count > 0) {
            assetAudioTrack = audioTracks[0]
        }

        if (assetAudioTrack != nil)
        {
            let decompressionAudioSettings:[String : AnyObject] = [
                AVFormatIDKey:Int(kAudioFormatLinearPCM)
            ]

            self.assetReaderAudioOutput = AVAssetReaderTrackOutput(track: assetAudioTrack!, outputSettings: decompressionAudioSettings)

            self.assetReader.addOutput(self.assetReaderAudioOutput)

            var channelLayout = AudioChannelLayout()
            memset(&channelLayout, 0, sizeof(AudioChannelLayout));
            channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;

            /*let compressionAudioSettings:[String : AnyObject] = [
                AVFormatIDKey:Int(kAudioFormatMPEG4AAC) ,
                AVEncoderBitRateKey:128000,
                AVSampleRateKey:44100 ,
               // AVEncoderBitRatePerChannelKey:16,
               // AVEncoderAudioQualityKey:AVAudioQuality.High.rawValue,
                AVNumberOfChannelsKey:2,
                AVChannelLayoutKey: NSData(bytes:&channelLayout, length:sizeof(AudioChannelLayout))
            ]

            var outputSettings:[String : AnyObject] = [
                AVFormatIDKey: Int(kAudioFormatLinearPCM),
                AVSampleRateKey: 44100,
                AVNumberOfChannelsKey: 2,
                AVChannelLayoutKey: NSData(bytes:&channelLayout, length:sizeof(AudioChannelLayout)),
                AVLinearPCMBitDepthKey: 16,
                AVLinearPCMIsNonInterleaved: false,
                AVLinearPCMIsFloatKey: false,
                AVLinearPCMIsBigEndianKey: false
            ]*/

            let outputSettings:[String : AnyObject] = [
                AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
                AVSampleRateKey: 44100,
                AVNumberOfChannelsKey: 2,
                AVChannelLayoutKey: NSData(bytes:&channelLayout, length:sizeof(AudioChannelLayout))
            ]

            self.assetWriterAudioInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: outputSettings)

            self.assetWriter.addInput(self.assetWriterAudioInput)
        }

        return true
    }

    func startAssetReaderAndWriter() -> Bool {

        self.assetWriter.startWriting()

        self.assetReader.startReading()

        self.assetWriter.startSessionAtSourceTime(kCMTimeZero)

        self.assetWriterAudioInput.requestMediaDataWhenReadyOnQueue(self.rwAudioSerializationQueue, usingBlock: {

            while (self.assetWriterAudioInput.readyForMoreMediaData ) {

                var sampleBuffer = self.assetReaderAudioOutput.copyNextSampleBuffer()

                if (sampleBuffer != nil) {
                    self.assetWriterAudioInput.appendSampleBuffer(sampleBuffer!)

                    sampleBuffer = nil

                } else {
                    self.assetWriterAudioInput.markAsFinished()
                    self.assetReader.cancelReading()
                    print("done")
                   break
                }
            }
        })

        return true
    }
}

Answer

Update

You're creating a caf file instead of an m4a.

AVAssetWriter(URL: self.outputURL, fileType: AVFileTypeCoreAudioFormat)
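A minimal sketch of the fix, keeping the question's Swift 2 style: pass AVFileTypeAppleM4A so the writer produces an MPEG-4 audio container that matches the .m4a extension (the error handling shown here is illustrative):

do {
    // AVFileTypeAppleM4A writes an .m4a (MPEG-4 audio) container instead of a .caf one.
    try self.assetWriter = AVAssetWriter(URL: self.outputURL, fileType: AVFileTypeAppleM4A)
} catch {
    print("Could not create asset writer: \(error)")
}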

Call self.assetWriter.finishWritingWithCompletionHandler() when you're done.
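As a rough sketch (reusing the properties from the question's code), the end of the copy loop could finish the writer instead of only marking the input as finished; without this call the m4a container is never finalized, which is why the file won't play:

                } else {
                    self.assetWriterAudioInput.markAsFinished()
                    self.assetReader.cancelReading()
                    // Finalize the container; the header/index data is only written out here.
                    self.assetWriter.finishWritingWithCompletionHandler({
                        print("done writing \(self.outputURL)")
                    })
                    break
                }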
