Native MIDI API

The Android Native MIDI API (AMidi) gives application developers the ability to send and receive MIDI data with C/C++ code, integrating more closely with their C/C++ audio/control logic and minimizing the need for JNI.

AMidi lets apps send and receive MIDI with C/C++. However, you must use the Java MidiManager API for device enumeration, discovery, and connection monitoring. In particular, it is necessary to understand the MidiManager and MidiManager.DeviceCallback classes.

To use AMidi functions from your C/C++ code, you will need to include AMidi/AMidi.h and link against the amidi library. Both can be found in version 20 of the Android NDK.

Sample Code: NativeMidi

Setting up AMidi

All apps that use AMidi have the same setup and teardown steps, whether they read MIDI, write it, or both (a compact sketch of the native-side steps follows the list):

  1. Discover MIDI hardware with the Java MidiManager class.
  2. Obtain a Java MidiDevice object corresponding to the MIDI hardware.
  3. Pass the Java MidiDevice to native code with JNI.
  4. Obtain an AMidiDevice with AMidiDevice_fromJava().
  5. Obtain an AMidiInputPort and/or AMidiOutputPort with AMidiInputPort_open() and/or AMidiOutputPort_open().
  6. Use the obtained ports to send and/or receive MIDI data.
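
The sections that follow cover these steps in detail. For orientation, here is a minimal, hypothetical sketch of the native side of steps 4 through 6 for a reading app (the helper names setUpMidi() and tearDownMidi() are illustrative, not part of the API):

#include <jni.h>
#include <AMidi/AMidi.h>

static AMidiDevice* device = NULL;
static AMidiOutputPort* outputPort = NULL;

// Step 4: wrap the Java MidiDevice (passed down through JNI) in an AMidiDevice,
// then (step 5) open output port 0 for receiving MIDI.
bool setUpMidi(JNIEnv* env, jobject midiDeviceObj) {
    if (AMidiDevice_fromJava(env, midiDeviceObj, &device) != AMEDIA_OK) {
        return false;
    }
    if (AMidiOutputPort_open(device, 0, &outputPort) != AMEDIA_OK) {
        return false;
    }
    // Step 6: the port can now be polled with AMidiOutputPort_receive()
    return true;
}

// Teardown mirrors setup: close any open ports, then release the device.
void tearDownMidi() {
    AMidiOutputPort_close(outputPort);
    AMidiDevice_release(device);
}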

MIDI reading apps

Receiving MIDI is an inherently asynchronous activity, so MIDI data should always be received on a background thread. AMidi does not block when reading data.

A typical example of a MIDI reading app is a "virtual synthesizer" that receives MIDI performance data to control audio synthesis. Since MIDI data can be received by the system asynchronously and at any time, it is necessary to use a thread to continuously poll the MIDI port for incoming data. For native audio-generating apps using OpenSL ES or AAudio, this is most simply accomplished in the audio callback function.

Discover and select MIDI hardware

There is no native API to enumerate/discover MIDI peripherals. You must use the existing Java MIDI API to discover and select attached MIDI hardware. Typically a MIDI reading app will use the Java MidiManager API to get a list of connected MIDI devices, and open one or more of them:

AppMidiManager.java

class AppMidiManager {
    // System MIDI Manager
    private MidiManager mMidiManager;

    public AppMidiManager(Context context) {
        mMidiManager = (MidiManager) context.getSystemService(Context.MIDI_SERVICE);
    }

    // Open a device
    public void openFirstDevice() {
        MidiDeviceInfo[] devInfos = mMidiManager.getDevices();
        MidiDeviceInfo devInfo = devInfos[0];
        mMidiManager.openDevice(devInfo,
                new OpenMidiReceiveDeviceListener(), null);
    }
}

Communicate the selected MIDI device to the native layer

Once the app selects a MIDI device, it must use JNI to pass the corresponding Java-based MidiDevice object to the native layer. The native layer uses this to obtain an AMidiDevice to use with the native MIDI API:

AppMidiManager.java

class AppMidiManager {
    public class OpenMidiReceiveDeviceListener
            implements MidiManager.OnDeviceOpenedListener {
        @Override
        public void onDeviceOpened(MidiDevice device) {
            if (device == null) {
                return; // the device could not be opened
            }
            // Hand the opened device to the native layer and read from port 0
            startReadingMidi(device, 0);
        }
    }
}

AppMidiManager.java

package com.nativemidiapp;

import android.media.midi.MidiDevice;

class AppMidiManager {
    public native void startReadingMidi(MidiDevice device, int portNumber);
}

AppMidiManager.c

#include <atomic>
#include <pthread.h>

#include <jni.h>
#include <AMidi/AMidi.h>

static pthread_t readThread;
static std::atomic<bool> reading(false);

static AMidiDevice* midiDevice = AMIDI_INVALID_HANDLE;
static std::atomic<AMidiOutputPort*> midiOutputPort(AMIDI_INVALID_HANDLE);

void* readThreadRoutine(void* context);

void Java_com_nativemidiapp_AppMidiManager_startReadingMidi(
        JNIEnv* env, jobject, jobject deviceObj, jint portNumber) {
    media_status_t deviceStatus =
        AMidiDevice_fromJava(env, deviceObj, &midiDevice);
    // check deviceStatus for errors...

    AMidiOutputPort* outputPort;
    media_status_t portStatus =
        AMidiOutputPort_open(midiDevice, portNumber, &outputPort);
    // check portStatus for errors...

    // Publish the open port so the read thread (and the audio callback) can use it
    midiOutputPort.store(outputPort);

    // Start the read thread
    int pthread_result =
        pthread_create(&readThread, NULL, readThreadRoutine, NULL);
    // check pthread_result for errors...
}

Receive and process MIDI data

Apps that read MIDI must poll the output (receiving) port and respond when AMidiOutputPort_receive() returns a number greater than zero. For apps that generate audio, this will typically be done in the main audio generation callback (the BufferQueue callback in OpenSL ES or the AudioStream data callback in AAudio). Since AMidiOutputPort_receive() is non-blocking, there is very little performance impact on the audio generation callback.

Note that for non-performance-critical or non-audio-generating MIDI reading apps (a "MIDI scope", for example), a plain low-priority background thread (with appropriate sleeps) can be used to read the low-bandwidth MIDI data.

The function readThreadRoutine() from above might look like this:

void* readThreadRoutine(void* /*context*/) {
    uint8_t inDataBuffer[SIZE_DATABUFFER];   // SIZE_DATABUFFER is defined by the app
    ssize_t numMessagesReceived;
    int32_t opCode;
    size_t messageSizeInBytes;
    int64_t timestamp;
    reading = true;
    while (reading) {
        AMidiOutputPort* outputPort = midiOutputPort.load();
        numMessagesReceived = AMidiOutputPort_receive(
            outputPort, &opCode, inDataBuffer, sizeof(inDataBuffer),
            &messageSizeInBytes, &timestamp);
        if (numMessagesReceived >= 0) {
            if (opCode == AMIDI_OPCODE_DATA) {
                // Dispatch the MIDI data...
            }
        } else {
            // Some error occurred; the negative numMessagesReceived is the error code
            int32_t errorCode = numMessagesReceived;
        }
        // A non-performance-critical app could sleep briefly here between polls
    }
    return NULL;
}
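
The bytes delivered with AMIDI_OPCODE_DATA are raw MIDI messages. What "Dispatch the MIDI data" actually does is up to the app; as a hypothetical sketch, a virtual synthesizer might pick out Note On and Note Off channel messages from inDataBuffer and messageSizeInBytes like this (startNote() and stopNote() stand in for app-defined synthesis calls):

// Hypothetical dispatch routine for 3-byte MIDI channel messages.
// The status byte's upper nibble is the command (0x90 Note On, 0x80 Note Off)
// and its lower nibble is the MIDI channel.
void dispatchMidiData(const uint8_t* data, size_t numBytes) {
    if (numBytes < 3) {
        return;  // ignore anything shorter than a 3-byte channel message
    }
    uint8_t command  = data[0] & 0xF0;
    uint8_t channel  = data[0] & 0x0F;
    uint8_t note     = data[1];
    uint8_t velocity = data[2];
    if (command == 0x90 && velocity > 0) {
        // startNote(channel, note, velocity);
    } else if (command == 0x80 || (command == 0x90 && velocity == 0)) {
        // stopNote(channel, note);
    }
}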

Alternatively, if your app is using a native audio API (like OpenSL ES or AAudio), add MIDI receive code to the audio generation callback:

void bqPlayerCallback(SLAndroidSimpleBufferQueueItf bq, void* /*context*/)
{
    uint8_t inDataBuffer[SIZE_DATABUFFER];
    ssize_t numMessagesReceived;
    int32_t opCode;
    size_t messageSizeInBytes;
    int64_t timestamp;

    // Generate audio...
    // ...

    // Read MIDI data (non-blocking)
    AMidiOutputPort* outputPort = midiOutputPort.load();
    numMessagesReceived = AMidiOutputPort_receive(
            outputPort, &opCode, inDataBuffer, sizeof(inDataBuffer),
            &messageSizeInBytes, &timestamp);
    if (numMessagesReceived >= 0) {
        if (opCode == AMIDI_OPCODE_DATA) {
            // Parse and respond to the MIDI data
            // ...
        }
    }
}
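
If your app uses AAudio instead of OpenSL ES, the same non-blocking poll can go in the AAudio data callback. A minimal sketch, assuming the same midiOutputPort global as above (the callback name and the audio rendering itself are illustrative):

#include <aaudio/AAudio.h>

aaudio_data_callback_result_t audioDataCallback(
        AAudioStream* stream, void* /*userData*/,
        void* audioData, int32_t numFrames) {
    uint8_t inDataBuffer[SIZE_DATABUFFER];
    int32_t opCode;
    size_t messageSizeInBytes;
    int64_t timestamp;

    // Poll for MIDI data (non-blocking)
    AMidiOutputPort* outputPort = midiOutputPort.load();
    ssize_t numMessagesReceived = AMidiOutputPort_receive(
            outputPort, &opCode, inDataBuffer, sizeof(inDataBuffer),
            &messageSizeInBytes, &timestamp);
    if (numMessagesReceived >= 0 && opCode == AMIDI_OPCODE_DATA) {
        // Parse and respond to the MIDI data
        // ...
    }

    // Render numFrames of audio into audioData...

    return AAUDIO_CALLBACK_RESULT_CONTINUE;
}

The callback is registered on the stream builder with AAudioStreamBuilder_setDataCallback() before the stream is opened.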

Close and release AMidi

The Java app signals the native layer to release resources and shut down.

When the app exits, your code should perform these tasks:

  1. Shut down and join the reading thread.
  2. Close any open AMidiInputPort and/or AMidiOutputPort objects with AMidiInputPort_close() and/or AMidiOutputPort_close() functions.
  3. Release the AMidiDevice with AMidiDevice_release().

AppMidiManager.java

public native void stopReadingMidi();

AppMidiManager.c

void Java_com_nativemidiapp_AppMidiManager_stopReadingMidi(
        JNIEnv*, jobject) {
    // Shut down and join the read thread
    reading = false;
    pthread_join(readThread, NULL);

    // Close the output port and release the device
    AMidiOutputPort* outputPort =
        midiOutputPort.exchange(AMIDI_INVALID_HANDLE);
    AMidiOutputPort_close(outputPort);
    AMidiDevice_release(midiDevice);
    midiDevice = AMIDI_INVALID_HANDLE;
}

The following diagram illustrates the flow of a MIDI reading app:

MIDI writing apps

A typical example of a MIDI writing app is a MIDI controller or sequencer. Since the timing of the outgoing MIDI data is well understood and controlled by the app itself, the data transmission can be done in the MIDI app's main thread. However, for performance reasons (as in a sequencer), the generation and transmission of MIDI can be done in a separate thread.

Apps can send MIDI data whenever required. Note that AMidi blocks when writing data.

Open the MIDI device

static AMidiDevice* sNativeSendDevice = NULL;
static AMidiInputPort* sMidiInputPort = NULL;

void Java_com_nativemidiapp_TBMidiManager_startWritingMidi(
        JNIEnv* env, jobject, jobject midiDeviceObj, jint portNumber) {
    media_status_t status;
    status = AMidiDevice_fromJava(
        env, midiDeviceObj, &sNativeSendDevice);
    // check status for errors...

    AMidiInputPort* inputPort;
    status = AMidiInputPort_open(
        sNativeSendDevice, portNumber, &inputPort);
    // check status for errors...

    // Store the open port in a global for use when sending
    sMidiInputPort = inputPort;
}

Generate and transmit MIDI data

void Java_com_nativemidiapp_TBMidiManager_writeMidi(
        JNIEnv* env, jobject, jbyteArray data, jint numBytes) {
    jbyte* bufferPtr = env->GetByteArrayElements(data, NULL);
    AMidiInputPort_send(sMidiInputPort, (uint8_t*)bufferPtr, numBytes);
    // JNI_ABORT: the buffer was only read, so nothing needs to be copied back
    env->ReleaseByteArrayElements(data, bufferPtr, JNI_ABORT);
}
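
The data passed to AMidiInputPort_send() is just raw MIDI bytes, so MIDI messages can also be built and sent entirely in native code. A hypothetical helper that sends a Note On message on the already-opened port might look like this:

// Send a 3-byte MIDI Note On message on the open input (sending) port.
// channel is 0-15; note and velocity are 0-127.
void sendNoteOn(uint8_t channel, uint8_t note, uint8_t velocity) {
    uint8_t msg[3] = {
        (uint8_t)(0x90 | (channel & 0x0F)),  // Note On status byte
        note,
        velocity
    };
    AMidiInputPort_send(sMidiInputPort, msg, sizeof(msg));
}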

Close and release AMidi

When the app exits, your code should perform these tasks:

  1. Close any open AMidiInputPort and/or AMidiOutputPort objects with AMidiInputPort_close() and/or AMidiOutputPort_close() functions. This can be done in parallel with thread stopping and joining. The state of the port is managed in a thread-safe fashion.
  2. Release the AMidiDevice with AMidiDevice_release().

void Java_com_nativemidiapp_TBMidiManager_stopWritingMidi(
        JNIEnv*, jobject) {
    // Close the input (sending) port, then release the device
    AMidiInputPort_close(sMidiInputPort);
    sMidiInputPort = NULL;
    AMidiDevice_release(sNativeSendDevice);
    sNativeSendDevice = NULL;
}

The following diagram illustrates the flow of a MIDI writing app: