

For quite some time, across various forums far and wide, there have been discussions about whether a DAW and a USB interface can provide solid, tight MIDI timing. Having started off with an Atari ST, which was always considered the "gold standard", and then moved on to Logic, this was something that always interested me. However, many of these forum discussions were filled with misinformation, given the lack of any hard evidence either way. A few months ago a developer released a MIDI latency/jitter utility, which allows for automated analysis of MIDI bytes that have been recorded as audio. Of course, this is only possible by way of a MIDI lead which has been converted to an audio jack at the other end. With the needed equipment I then set about testing my various sequencers and USB interfaces, to see what exactly is going on and whether OSX MIDI timestamping actually does what is intended, or is just a marketing term.

I first tested the "gold standard" Atari ST with Cubase V3, to use these results as a benchmark. Over a number of tests, the MIDI jitter utility reported average note-on jitter = 0.026ms RMS (max 0.18ms). It was clear why the Atari was always considered tight for MIDI, as these results were fantastic.

I then moved on to my Edirol UM880 and Ableton Live 9. This resulted in average note-on jitter = 0.02ms (max 0.15ms), with this excellent timing being consistent across all eight MIDI out ports. This was better timing than the Atari ST, and proved that USB having sloppy MIDI timing is an internet forum myth!! OSX CoreMIDI timestamping actually worked.

However, with the same UM880 USB interface and Logic X I got different results: average note-on jitter = 0.09ms (max 0.74ms). Using the MIDI jitter utility it's possible to display a waveform of how the MIDI data jitters and possibly shifts over time, and most of the jitter peaks (0.74ms) in Logic X's MIDI output occur with data rushing ahead of time. (Logic X only sends MIDI timestamps when using the External Instrument plug-in; MIDI output from regular MIDI tracks goes through the Environment, where it seems all events are treated as realtime and thus carry no MIDI timestamps when output to the interface.)

Given that OSX, CoreMIDI, and the UM880 drivers are the same in both cases, the timing figures for Logic X and Live 9 can only come down to the OSX CoreMIDI timestamping feature, the firmware within the UM880 which handles timestamps, and what each DAW is doing when it sends timestamps. (An interface without MIDI timestamping firmware will behave as a plain vanilla USB interface, without the tight timing, regardless of the DAW sending timestamps.) Using the "MIDI Monitor" utility it's possible to inspect the exact timing of the timestamps being sent and check their accuracy; in its preferences I've set the time format to show "Host time (nanoseconds)". For Live 9 sending 8th notes, the difference between each MIDI event's timestamp had a constant value of 125. The same 8th-note sequence from Logic X showed that the interval between timestamps varied: some would be 125, 126, 124, 123. Are the timestamps in Live 9 more "accurate" than the ones in Logic X? Is this difference in timestamps the cause of the difference, and the better timing, in Live 9 over Logic X? Is the difference in the timing of MIDI timestamps down to the host DAW?

I doubt the Logic X developers are sitting around wiring MIDI leads to audio jacks and then recording their MIDI output, so hopefully these figures for Logic X's MIDI timestamping output can serve as a guide to them, and help improve Logic X's MIDI timing!
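For anyone wondering what "average note-on jitter" means in these figures, here is a minimal Python sketch of the kind of calculation involved. The onset times and the grid-fitting method (anchoring an ideal grid at the first note) are my own illustrative assumptions, not the actual utility's algorithm.

```python
# Sketch of a note-on jitter measurement (my reconstruction, not the
# actual MIDI jitter utility): given note-on times recovered from an
# audio recording of the MIDI lead, measure how far each event drifts
# from a perfectly even grid.
import math

def jitter_stats(onsets_ms, expected_interval_ms):
    """Return (mean_abs, rms, max_abs) deviation in ms from an ideal grid."""
    t0 = onsets_ms[0]  # anchor the ideal grid at the first note
    deviations = [t - (t0 + i * expected_interval_ms)
                  for i, t in enumerate(onsets_ms)]
    abs_dev = [abs(d) for d in deviations]
    mean_abs = sum(abs_dev) / len(abs_dev)
    rms = math.sqrt(sum(d * d for d in deviations) / len(deviations))
    return mean_abs, rms, max(abs_dev)

# Made-up onsets: 8th notes at 120 BPM (250 ms apart) with a small
# timing error on each event.
onsets = [0.0, 250.03, 499.98, 750.10, 999.95]
mean_abs, rms, max_abs = jitter_stats(onsets, 250.0)
# mean_abs ≈ 0.04 ms, max_abs ≈ 0.10 ms
```

Whether a tool reports the mean of the absolute deviations or the RMS changes the headline number slightly, which is worth bearing in mind when comparing figures like "0.026ms RMS" against a plain average.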

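The timestamp-interval comparison I did by eye in MIDI Monitor is easy to reproduce on any list of timestamps. A short Python sketch, using made-up values chosen to mirror the deltas described above rather than real captured data:

```python
# Sketch: checking the spacing of MIDI event timestamps, as observed
# in MIDI Monitor. The values are illustrative, in the same arbitrary
# units as the deltas quoted in the text.
def timestamp_deltas(stamps):
    """Successive differences between consecutive event timestamps."""
    return [b - a for a, b in zip(stamps, stamps[1:])]

live9 = [0, 125, 250, 375, 500]   # Live 9: perfectly even spacing
logic = [0, 125, 251, 375, 498]   # Logic X: wobbling around 125

print(timestamp_deltas(live9))    # every delta is 125
print(timestamp_deltas(logic))    # deltas vary: 125, 126, 124, 123
```

A constant delta means the DAW computed each timestamp from the sequencer grid; varying deltas suggest the timestamps were derived from when each event was actually generated.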