How to play MIDI in Swift?

As a big music lover I always like to experiment with different synths, sound banks and instruments. MIDI is one of those fancy things that has existed for a long, long time and will probably keep on going. It lets electronic musical instruments, computers and many other devices communicate and synchronize with each other across platforms. Musicians, composers and sound engineers use it to control and exchange musical information such as notes and dynamics, and to drive virtual instruments in a DAW (Digital Audio Workstation). Over the years MIDI has revolutionized music production, enabling seamless integration of various instruments and technologies in the creation and performance of music. In this post I want to show how to play some simple notes using the MIDI interface in iOS.

tl;dr: the full example is available here

SoundFont

Before jumping into the code we need to get a SoundFont (sound bank). MIDI itself doesn’t contain any sounds – it contains only instructions on how to play them. You can think of it as an interface. The actual sound therefore has to be produced by some sound bank (the implementation). Each bank contains a map of sounds stored in sections similar to a musical keyboard. Some banks support effects such as vibrato, pitch bending and many more.

You can download a free sound bank from Musical Artifacts. For this example I decided to use a Nokia Tongbao Bank 8-bit version. Download the file and drag it into your Xcode project.

Note: the sound bank won’t play in the simulator; you need a real device. In the simulator you get a plain sine tone instead.

Create the first track

To play MIDI we need to create a MusicSequence with MusicTrack and MIDINoteMessages.

A MusicSequence can contain multiple MusicTracks. A MusicTrack is an object to which you can add multiple MIDINoteMessages at specific MusicTimeStamps. In general you can think of each MusicTrack as a separate instrument and of each MIDINoteMessage as a note played at a specific time. Once the sequence, tracks and notes are prepared, they are passed to an AVMIDIPlayer or AVAudioEngine to play them. Let’s create a class called Song which will be responsible for maintaining the tracks, tempo and notes.

// Song.swift

import AVFoundation


class Song {
    var musicSequence: MusicSequence?
    var tracks: [Int: MusicTrack] = [:]
    
    init() {
        guard NewMusicSequence(&musicSequence) == OSStatus(noErr) else {
            fatalError("Cannot create MusicSequence")
        }
    }
}

Each song contains a single MusicSequence, which is basically the definition of that song. Before we write any notes we need a track to write them onto. Let’s add a method that creates a MusicTrack.

// Song.swift
func addTrack(instrumentId: UInt8) -> Int {
    var track: MusicTrack?
    
    guard MusicSequenceNewTrack(musicSequence!, &track) == OSStatus(noErr) else {
        fatalError("Cannot add track")
    }
    
    let trackId = tracks.count
    tracks[trackId] = track
    
    // Program change (status 0xC0, channel 0): select the instrument for this track.
    var inMessage = MIDIChannelMessage(status: 0xC0, data1: instrumentId, data2: 0, reserved: 0)
    MusicTrackNewMIDIChannelEvent(track!, 0, &inMessage)
    
    return trackId
}

Each track has to be connected to a MusicSequence. The interesting part here is MIDIChannelMessage – a control message sent to the MIDI driver. The status code 0xC0 is a program change: it tells the driver to use the instrument with the passed instrumentId. If you have ever played a musical keyboard you may remember that you can switch instruments by typing a number. In the General MIDI mapping you can expect 0 to be Acoustic Grand Piano, 30 to be Distortion Guitar and so on. Not all banks support all instruments, but you can get a list of the instruments supported by the bank you selected using CopyInstrumentInfoFromSoundBank. If you are interested in more MIDI events you can check the official specification.
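
For reference, a minimal sketch of dumping a bank’s instruments could look roughly like this (the listInstruments helper is my own illustration, not part of the post’s project):

// Sketch: list the instruments a SoundFont provides.
import AudioToolbox
import Foundation

func listInstruments(in bankURL: URL) {
    var unmanagedInfo: Unmanaged<CFArray>?
    guard CopyInstrumentInfoFromSoundBank(bankURL as CFURL, &unmanagedInfo) == OSStatus(noErr),
          let instrumentInfos = unmanagedInfo?.takeRetainedValue() else {
        print("Could not read instrument info")
        return
    }
    // Each entry is a dictionary describing one instrument (name, program number, bank MSB/LSB).
    for info in instrumentInfos as NSArray {
        print(info)
    }
}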

When a fresh MusicSequence is created, it already contains a tempo track. If the tempo should be altered, that track has to be updated. The painful part is that every existing event has to be removed from the tempo track first. Setting the new tempo itself, however, has a ready-made function: MusicTrackNewExtendedTempoEvent.

NOTE: Make sure to remove the existing tempo events before applying a new tempo, otherwise the track may end up with conflicting tempo events.

// Song.swift
func setTempo(tempo: Float64) {
    let timeStamp = MusicTimeStamp(0)
    var tempoTrack: MusicTrack?
    MusicSequenceGetTempoTrack(musicSequence!, &tempoTrack)
    
    // Clear any existing tempo events before setting the new one.
    removeTempoEvents(tempoTrack: tempoTrack!)
    
    MusicTrackNewExtendedTempoEvent(tempoTrack!, timeStamp, tempo)
}

private func removeTempoEvents(tempoTrack: MusicTrack) {
    var tempIter: MusicEventIterator?
    NewMusicEventIterator(tempoTrack, &tempIter)
    var hasEvent: DarwinBoolean = false
    MusicEventIteratorHasCurrentEvent(tempIter!, &hasEvent)
    while hasEvent == true {
        var stamp = MusicTimeStamp(0)
        var type: MusicEventType = 0
        var data: UnsafeRawPointer? = nil
        var sizeData: UInt32 = 0
        
        MusicEventIteratorGetEventInfo(tempIter!, &stamp, &type, &data, &sizeData)
        if type == kMusicEventType_ExtendedTempo {
            // Deleting an event moves the iterator to the event that followed it.
            MusicEventIteratorDeleteEvent(tempIter!)
            MusicEventIteratorHasCurrentEvent(tempIter!, &hasEvent)
        } else {
            MusicEventIteratorNextEvent(tempIter!)
            MusicEventIteratorHasCurrentEvent(tempIter!, &hasEvent)
        }
    }
    DisposeMusicEventIterator(tempIter!)
}

MusicSequenceGetTempoTrack fetches the current tempo track from the MusicSequence. Using an iterator we go through all of its events and delete every one of type kMusicEventType_ExtendedTempo. Once the tempo track is cleared of tempo events, we call MusicTrackNewExtendedTempoEvent to set the new tempo.

Add some fancy notes

Once the track is ready, it’s time to add some notes to it.

// Song.swift
func addNote(trackId: Int, note: UInt8, duration: Float, position: Float) {
    let time = MusicTimeStamp(position)
    
    var musicNote = MIDINoteMessage(channel: 0,
                                    note: note,
                                    velocity: 64,
                                    releaseVelocity: 0,
                                    duration: duration)
    guard MusicTrackNewMIDINoteEvent(tracks[trackId]!, time, &musicNote) == OSStatus(noErr) else {
        fatalError("Cannot add Note")
    }
}

Adding notes is very straightforward. You set the position using a MusicTimeStamp and add the note to a specific track. Just like in real music, you can put multiple notes at the same timestamp on the same track to form a chord. Let’s create a class SongComposer that will act as a director for building a proper Song object.

//SongComposer.swift

class SongComposer {
    func compose() -> Song {
        let song = Song()
        let trackId = song.addTrack(instrumentId: 48)
        song.setTempo(tempo: 240)
        var currentPosition: Float = 0
        for _ in 0...1 {
            song.addNote(trackId: trackId, note: 64,
                      duration: 1.0, position: currentPosition)
            currentPosition += 1.0
            song.addNote(trackId: trackId, note: 63,
                      duration: 1.0, position: currentPosition)
            currentPosition += 1.0
        }
        song.addNote(trackId: trackId, note: 64,
                  duration: 1.0, position: currentPosition)
        currentPosition += 1.0
        song.addNote(trackId: trackId, note: 59,
                  duration: 1.0, position: currentPosition)
        currentPosition += 1.0
        song.addNote(trackId: trackId, note: 62,
                  duration: 1.0, position: currentPosition)
        currentPosition += 1.0
        song.addNote(trackId: trackId, note: 60,
                  duration: 1.0, position: currentPosition)
        currentPosition += 1.0
        
        song.addNote(trackId: trackId, note: 57, duration: 3.0, position:currentPosition)
        song.addNote(trackId: trackId, note: 45, duration: 3.0, position:currentPosition)
        
        return song
    }
}

SongComposer is quite a dummy class. It simply creates a song with one track, sets a tempo and adds some notes. When you finally play those few notes, I’m pretty sure you will recognize them 😉

While duration and position seem self-explanatory, the note expressed as a number may not. Here’s a reference table that maps MIDI note numbers to note names, keyboard numbers and frequencies.
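
If you prefer to compute the values, MIDI note 69 corresponds to A4 at 440 Hz and every step is one semitone, so a quick equal-temperament helper (my own sketch, not part of the project) looks like this:

import Foundation

// Equal temperament: note 69 = A4 = 440 Hz, each step is one semitone.
func frequency(forMIDINote note: UInt8) -> Double {
    440.0 * pow(2.0, (Double(note) - 69.0) / 12.0)
}

// frequency(forMIDINote: 60) ≈ 261.63 Hz (middle C), frequency(forMIDINote: 64) ≈ 329.63 Hz (E4)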

Time to play!

We have created our song, but we still have to play it somehow. To do that, let’s create a class MidiPlayer.

//MidiPlayer.swift
import AVFoundation


class MidiPlayer {
    var midiPlayer: AVMIDIPlayer?
    var bankURL: URL
    
    init() {
        guard let bankURL = Bundle.main.url(forResource: "Nokia_Tongbao_Bank__Series_30__8-bit", withExtension: "sf2") else {
            fatalError("\"Nokia_Tongbao_Bank__Series_30__8-bit.sf2\" file not found.")
        }
        self.bankURL = bankURL
    }
    
    func prepareSong(song: Song){
        var data: Unmanaged<CFData>?
        guard MusicSequenceFileCreateData(song.musicSequence!, MusicSequenceFileTypeID.midiType, MusicSequenceFileFlags.eraseFile, 480, &data) == OSStatus(noErr) else {
            fatalError("Cannot create music midi data")
        }
        
        if let md = data {
            // MusicSequenceFileCreateData follows the Create rule, so take a retained value to avoid leaking the CFData.
            let midiData = md.takeRetainedValue() as Data
            do {
                try self.midiPlayer = AVMIDIPlayer(data: midiData, soundBankURL: self.bankURL)
            } catch let error {
                fatalError(error.localizedDescription)
            }
        }
        self.midiPlayer!.prepareToPlay()
    }
    
    func playSong() async {
        if let md = self.midiPlayer {
            md.currentPosition = 0
            await md.play()
        }
    }
}

First we load a sound bank in the initializer; you can use any SoundFont bank you like. Then we have the prepareSong method. Its responsibility is to serialize all the events from the MusicSequence into a MIDI data blob. Finally we take that data and create an AVMIDIPlayer with it and the chosen sound bank. The last call, prepareToPlay, is not strictly required, but omitting it may cause a delay when play is called.

Note: this example plays the notes from in-memory data. If you have a MIDI file, you can use AVMIDIPlayer’s initializer with the contentsOf argument, which takes a URL to the MIDI file.
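
That variant could look roughly like this (a sketch only; "melody.mid" is a hypothetical bundled file and bankURL is the sound font loaded earlier):

// Sketch: playing an existing MIDI file instead of in-memory data.
if let midiURL = Bundle.main.url(forResource: "melody", withExtension: "mid"),
   let player = try? AVMIDIPlayer(contentsOf: midiURL, soundBankURL: bankURL) {
    player.prepareToPlay()
    player.play(nil)
}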

Finally, the playSong method seeks to the beginning of the sequence and triggers play on the AVMIDIPlayer.

And that’s it. Combine all of the above and run a quick test:

let composer = SongComposer()
let song = composer.compose()
let player = MidiPlayer()

player.prepareSong(song: song)

await player.playSong()

Summary

In this post I wanted to show how to play some MIDI notes using Swift on iOS. While the MIDI structure and eventing may seem complex at first glance, you can see in action how easy and fast it actually is to play some custom notes. The Swift APIs are well structured for reaching specific tracks and events or using native MIDI eventing. Now take some time to compose some good music ♩ ♪ ♫ ♬

