Synth Talk

An Interview With the Creator of Xequence

Published on Nov 15, 2019 by Synth Talk

Alexander Ewering is the creator of Xequence, an advanced linear MIDI sequencer, editor, arranger and controller for iPhone and iPad. I had a chance to chat with Alexander and find out more about the thinking behind this beautifully designed app.

Background

How long have you been working with iOS music apps?

I started my first iOS app around 2014. Its successor is actually still in the App Store; it's called MusicFolder 2. My life in general has always been quite focused on music, and I've always been involved with composing and producing in one way or another. It's a very long story that started with trackers on the PC around 1994 or so…

Do you have any tracks available?

Yes, I primarily focus on software development now, but I’m still making music with Xequence and the abundance of great iOS apps. I have a SoundCloud where I randomly put bits and pieces.

https://soundcloud.com/mind_in_motion

Did you create Xequence yourself or with help?

I’m currently the only developer. However, a big shoutout has to go out to the small but amazing private beta team, who are, of course, of immense help with testing.

How did Xequence start?

Xequence really started as a pet project purely for my own enjoyment. There never was an actual plan to make Xequence a product; it's mostly just by chance. Though I loved the sounds in Korg Gadget, the scene-based sequencer did not fit my style of working. So, I started work on a pure linear MIDI sequencer. In the beginning, I would never have thought it would even get close to finished! In every aspect of the user interface and workflow design, I've always pursued one goal: to get my ideas from brain to device as fast as possible. It's that simple, really.

Features

You mention the user interface is fine-tuned in many non-obvious and intricate ways. Are there any aspects you want to highlight?

It’s hard to highlight individual aspects. It’s how the individual parts work together to form the user experience.

On the iPhone, the ruler is modal: you tap the ruler button to bring it up, place the song position pointer where you want it, and the ruler automatically disappears. This saves screen real estate on an already small device.

Another example is the sliders: the slider module has many parameters to fine-tune its reaction to touch and movement. Every slider has its own precisely tuned logarithmic response and sensitivity, so you can sweep through a sensible value range without too much fiddling. It may not be obvious, but many apps forget aspects like this, so maybe it's a good thing to highlight here.
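To illustrate what a logarithmic slider response means in practice, here is a minimal Python sketch. The function name, range bounds and curve are assumptions for illustration; Xequence's actual per-slider tuning is not public.

```python
def slider_to_value(position, lo, hi):
    """Map a normalized touch position (0.0 to 1.0) to a parameter value
    on a logarithmic curve, so equal finger movements feel like equal
    *relative* changes across the whole range."""
    return lo * (hi / lo) ** position

# A frequency-style slider from 20 Hz to 20 kHz: halfway along the
# slider lands near the geometric midpoint (~632 Hz), not 10 kHz.
value = slider_to_value(0.5, 20.0, 20000.0)
```

With a plain linear mapping, half the slider's travel would be spent on the 10–20 kHz octave alone; the logarithmic curve spreads the musically useful range evenly under the finger.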

Also, you can touch the sliders anywhere; you don't need to pinpoint a little circle like, for example, in the iOS system sliders.

Then, all user interface elements try to respond as quickly as possible to user interaction, i.e., no unnecessary animations. If there are any animations, they're very short, just to give a brief hint of what's happening.

Can you explain what “Best-in-class MIDI timing and clock output” means?

MIDI timing has been a long-standing issue on iOS, and I think Xequence might even have helped highlight it a bit. Some time ago, many apps used to send MIDI “in real time” to other apps or hardware. It seems obvious: when you develop an app that wants to send MIDI, say one note at 1 second, another at 1.2 seconds, and a third at 1.4 seconds, you would just wait until 1, 1.2 and 1.4 seconds have passed and each time tell the operating system to send the MIDI note. However, that's problematic because the operating system cannot work in real time. There will always be a processing delay, so your notes will not reach the other app or hardware at precisely 1, 1.2 or 1.4 seconds, but maybe 10 milliseconds earlier or later. Ten milliseconds is a huge amount of time in musical terms, and that's why many apps have jitter.

It’s not that problematic with many musical styles such as slower music, or music which is supposed to sound “human” anyway. But with tight EDM, which I mostly make, it is really very noticeable and absolutely exact timing is crucial.

How did you improve the MIDI timing in Xequence?

Xequence does not wait until the precise moment a note has to be sent; instead, it attaches a so-called timestamp to it, for example: “this note has to sound exactly at 1.4938 seconds.” Then it sends the note off to iOS BEFORE it is actually supposed to sound. iOS is smart about these timestamps, and if it knows the RECEIVING apps support them too, it can even forward the MIDI data to them immediately. The receiving apps then know, several hundred milliseconds in advance, the exact time when a note or drum is supposed to sound. In the end, you can get sample-accurate MIDI timing. Many more apps support this now, possibly because of my incessant complaining to developers and on the Audiobus forum. Timing has improved across the board.
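The scheme Alexander describes can be sketched in a few lines of Python. This is a conceptual model only, not Xequence's actual code (which would use CoreMIDI packet timestamps); the `lookahead` value is an assumed parameter.

```python
def schedule_notes(note_times, lookahead=0.5):
    """Attach an absolute timestamp to each event and hand it off
    `lookahead` seconds early, instead of sleeping until each event
    is due. OS scheduling jitter then affects only the hand-off,
    never the audible timing carried by the timestamp.
    Returns (dispatch_time, timestamp) pairs."""
    return [(max(0.0, t - lookahead), t) for t in note_times]

# Notes meant to sound at 1.0, 1.2 and 1.4 s are dispatched at
# 0.5, 0.7 and 0.9 s, each carrying its exact musical time.
events = schedule_notes([1.0, 1.2, 1.4])
```

Because the receiver renders each event at its timestamp rather than at arrival time, any delay in the hand-off below the lookahead window is inaudible.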

The Polyhymnia feature is like an entire app within an app. How did you come up with that idea?

I never really played with algorithmic music creation, but throughout my composing, I often noticed that many melody lines and drum parts follow mathematical patterns. I'm also quite bad at coming up with musical ideas from scratch! So, again out of pure egoism, Polyhymnia was born! You've probably noticed that it's very mathematical: you can actually choose the waveforms, along with their phase and frequency. But it all goes a bit deeper. For example, the “Auto-Generate Settings” page has quite a bit of “intelligence” built in to ensure it actually generates parameters that yield musically useful results. It doesn't just set all sliders in all modules to random values. A lot of experimentation and fine-tuning of formulas went into “Auto-Generate Settings” so that the “presets” it generates are useful most of the time.
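A toy version of the waveform idea, sampling a sine and quantizing it to a scale, might look like the Python sketch below. All names and the mapping are illustrative assumptions, not Polyhymnia's actual algorithm.

```python
import math

def waveform_melody(steps, freq, phase, scale, root=60):
    """Sample a sine wave at each step and quantize the result to a
    scale: a simplified sketch of waveform-driven pattern generation.
    `freq` is in cycles per 16 steps; `scale` lists semitone offsets."""
    notes = []
    for i in range(steps):
        x = math.sin(2 * math.pi * freq * i / 16 + phase)   # -1.0 .. 1.0
        idx = round((x + 1) / 2 * (len(scale) - 1))         # nearest scale degree
        notes.append(root + scale[idx])                     # MIDI note number
    return notes

# Eight steps of one sine cycle per bar, mapped onto a C major pentatonic.
melody = waveform_melody(8, freq=1.0, phase=0.0, scale=[0, 2, 4, 7, 9])
```

Constraining the output to a scale is one simple instance of the “intelligence” mentioned above: raw waveform values rarely sound musical until they are snapped to degrees that fit the key.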

How did you arrive at the multi-track recording feature?

That's one example of a feature that was added purely due to user requests. It was interesting, too, because I had no personal need for it, so I wasn't exactly sure how to implement it efficiently in terms of workflow. I based it almost exclusively on user input. I do listen to users!

Anything you want to say about IAA or AUv3?

About IAA: It is officially deprecated by Apple and definitely on the way out. Xequence shows up as an IAA app in other hosts, so I get many questions about it, for example why Xequence doesn’t present an IAA “Return to Host” button. However, the fact that Xequence shows up there is just a side-effect of its Audiobus support. Audiobus internally still uses IAA for routing audio and MIDI, and unfortunately Xequence cannot be hidden from the list of IAA apps.

About AUv3: I received many requests for an AUv3 version of Xequence, so that the routing and state-saving part of a project becomes easier. I must admit I'm still not convinced that such a complex app, which essentially requires the full screen anyway, would be well suited to being hosted as an AUv3. I try to make routing MIDI as easy as possible via normal CoreMIDI and Audiobus, including state-saving. I think making it an AUv3 would not yield a huge improvement, while being technically quite challenging. You also get the whole file management problem with AUv3.

I’ve chosen a different route, namely to make individual Xequence modules available as separate AUv3 apps. So far, the keyboard and drum pads are available in the App Store. I think that’s a good compromise.

Looking Forward

Can you share any future plans for Xequence?

There’s a very long roadmap of features and improvements that have been proposed by users and many that I would like to see myself. I’m confident that most of them will see the light of day at some point! As I mentioned earlier, Xequence was born out of my own desire to make music efficiently on the go. It’s not a “product”, but it’s my baby and so I will take care of it.

Here’s a rough overview of possible things to come! No guarantees.

- Ghost notes / controllers. See notes or controllers for the same instrument greyed out in the background while editing in the piano roll or controller editors, to give a sense of context. Very high on the list.

- Improvements for odd time signatures, and more options for swing.

- The ability to load and save instruments, so that for example controller mappings for a certain device or app don’t have to be re-made all the time.

- One big feature that would probably add enormous value would be AUv3 hosting, so that Xequence would be able to directly load AUv3s into a project. That would make it a full host and arguably a DAW if audio tracks are also added. That is for the long-term roadmap, but it is very much under consideration!

How can your customers help you?

They are already of great help and I couldn’t really ask for better customers most of the time. 95% of the feedback I get is very constructive and helpful. I enjoy reading most emails and forum posts that come up.

Is there anything else you would like to add?

As I said, I really love Xequence like a baby and I don’t see it as a product, as it wasn’t intended as one right from the start. Also, while it doesn’t make me a whole living, it is more successful than I expected and seeing happy customers and useful feedback adds more motivation for maintaining and enhancing it. So, barring natural disasters and nuclear wars, users can be assured that the app will continue to evolve and improve.

Conclusion

Many popular iOS music apps are created by independent developers, like Alexander, who work hard to create a great product. There will be more articles to come highlighting the person and the process behind the app. Check back here or subscribe to the newsletter if you would like to hear more Synth Talk.

Xequence 2 website

Xequence 2 on the App Store

If you have suggestions or topics you want covered please contact me. 🙂
