AudioToolbox introduction

Hi. I'm new to Swift development and curious about audio recording on macOS. Unfortunately, I haven't managed to record anything so far. There are several approaches (AVFoundation, AudioToolbox) to choose from, and it's difficult for me to decide which one best fits my purpose.

I want to record voice through the internal microphone. That alone would be a great start. I'd also like to record instruments connected through external audio devices and do some processing, like displaying the waveform and the pitch. The documentation I've read so far says that the System Preferences settings for audio devices determine which ones are used. However, the audio tools I've used (including GarageBand) all come with their own I/O settings.

From the documentation alone I couldn't manage to create any audio recording on macOS. There seem to be a few more tutorials for iOS, but I want to build a macOS app. I was wondering whether there is a book, a course, or anything else anyone could recommend for learning about audio capture. Or perhaps someone is willing to share a working code snippet to start with.


Audio has a very steep learning curve, unfortunately. I think a good starting point is a framework like AudioKit. They have great tutorials, too. It's a nice abstraction layer where you create and connect nodes that do the job for you. (No abstraction is perfect, though; you may or may not end up ditching the framework and diving into Core Audio, but optimistically AudioKit might solve your problems.)
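If you'd rather start with Apple's own frameworks, here is a minimal sketch of a capture loop using AVFoundation's `AVAudioEngine`. This is an illustrative sketch, not production code: error handling is reduced to `try!`/`try?`, the output path is just an example, and on macOS the app additionally needs an `NSMicrophoneUsageDescription` entry in its Info.plist (and, when sandboxed, the audio-input entitlement) before the tap delivers any samples.

```swift
import AVFoundation

// Capture from the default input device (the one selected in
// System Preferences) and write the incoming buffers to a .caf file.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// Example destination; replace with a real location in your app.
let url = URL(fileURLWithPath: NSTemporaryDirectory())
    .appendingPathComponent("recording.caf")
let file = try! AVAudioFile(forWriting: url, settings: format.settings)

// The tap hands you PCM buffers as they arrive. This closure is also
// the natural place to feed samples into waveform drawing or pitch
// detection instead of (or in addition to) writing them to disk.
input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
    try? file.write(from: buffer)
}

try! engine.start()
// ...record for a while, then tear down:
// engine.stop()
// input.removeTap(onBus: 0)
```

The same node-and-tap structure is what AudioKit wraps for you, so experimenting with this sketch first can make the framework's abstractions easier to follow.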


Thank you. I hadn't come across AudioKit before. It looks really promising; I'll give it a try.
