iPad Music Apps?

I’m now leaning towards spending the $10 on Xequence 2 instead of the Factory IAP. Been reading through the thread on the AB forum and it looks intriguing enough to me.

SevenSystems plans to raise the price next week.

Nice setup

I JUST got Atom, so I'll mess with it before anything else, but out of curiosity, how do the two compare?

My experience with AUM and music apps on iPad Pro over the last couple of months (as a live performance and composition tool) is that it's awesome when it works, but there are still significant barriers to it being a serious tool you can rely on. I'm back on my Mac full-time now, and it's easier, more stable, and less effort in general.

Just my experience; maybe I'm doing things wrong. I'd love to know who would take it on stage as it is now!

3 main issues I experienced:

1 - Lack of standard interface support across all apps. Some apps have some form of MIDI, but lots don't support program changes etc. Some apps support Link, some support MIDI or IAA, some are AUv3, so if you find an app you really like, chances are you're going to have to put some shim in to get it working with the other apps you're using. One example: to get AUM to follow the clock from my Digitakt, I need to run AUM inside of Audiobus. So I've got apps running in apps running in apps, just to get external clock in! (See the sketch after this list for what following an external clock actually involves.)

2 - Stability. Apps crash within AUM (because they're sandboxed and run out of memory, I guess?) way more than plugins crash on a Mac. Can't have that while performing or recording.

3 - Lack of ongoing support for apps. This is changing, but lots of app makers clearly don't make enough money to maintain their excellent apps, so bit rot creeps in and they become increasingly unusable over time. I wouldn't want to rely on many of the apps I'm messing with over a long period.
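For context on what "following the clock" means: MIDI clock is just a stream of tick messages at 24 per quarter note, and the receiving app has to count and time them to derive the tempo and stay in sync. Here's a minimal, purely illustrative Python sketch using the mido library (the port name is a placeholder, and this is obviously not AUM or Audiobus code):

```python
# Illustrative only: derive a tempo from incoming MIDI clock ticks.
import time
import mido

TICKS_PER_QUARTER = 24  # fixed by the MIDI spec

with mido.open_input("Elektron Digitakt") as port:  # placeholder port name
    tick_count = 0
    quarter_start = time.monotonic()
    for msg in port:
        if msg.type == "clock":
            tick_count += 1
            if tick_count == TICKS_PER_QUARTER:
                now = time.monotonic()
                bpm = 60.0 / (now - quarter_start)  # 24 ticks = one quarter note
                print(f"~{bpm:.1f} BPM")            # first reading is approximate
                tick_count, quarter_start = 0, now
        elif msg.type in ("start", "stop", "continue"):
            print("transport:", msg.type)
```

Every app in the chain has to do something like this (or hand it off to a host), which is why a missing clock input forces the app-inside-app workarounds above.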

I really hope the new inter-app audio standard Apple is pushing in iOS cleans things up. I'd love to see Ableton or Bitwig etc. invest in iOS so there's some real momentum driving standard interfaces and cross-app functionality.

2 Likes

Patterning is just amazing.

I’ve used the iPad exclusively for music for years, and for general computing too. But I too am switching back to my old, old Mac for recording, for the same reasons as you.

My main gripe is that to process a sample you need to load the same sample into each app individually. The way I work, this eats up memory quickly. It's not a serious music-making hub, just a super nice sketchpad with lots of great materials.

Thank you for revealing your workflow… this piece is amazing and reminds me of the work of Loscil and other drone artists.
Your explanation gave me a good start for messing around with the Spectrum suite! Thanks mate!

1 Like

I am honestly not sure. The description of MIDI time stretching in the other forum thread looks appealing, but apparently Atom also has it.

I did read on a recent thread that Atom always merges its MIDI data, so that its output is only on one MIDI channel all the time. There is also no MIDI export.
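To illustrate what that merging means in practice, here's a purely hypothetical sketch using Python's mido library (nothing to do with Atom's actual code): notes recorded from several channels all come back out on a single channel, so the per-channel separation is gone.

```python
# Hypothetical illustration of "merging to one MIDI channel" (not Atom code).
# Note numbers and channels are made up.
import mido

recorded = [
    mido.Message("note_on", note=60, channel=0),  # e.g. a bass part
    mido.Message("note_on", note=72, channel=5),  # e.g. a lead on another channel
]

# On output everything is forced onto one channel, so downstream apps
# can no longer route the parts to different instruments by channel.
merged = [msg.copy(channel=0) for msg in recorded]
print(merged)
```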

There are a couple of users on the other thread who are using Atom and Xequence 2 together, generally recording Atom MIDI output into Xequence 2.

This should be changing with Files support on iOS (which is being improved in iOS 13), but as someone mentioned, one issue is that many apps don't get updates any more, so a lot of old apps are stuck with AudioShare etc.

Also did something with the new Spectrum AUs. One Resonator is being driven by Rozeta Particles, while a Stepbud instance is driving another Resonator and a Spectrum instance. The same Stepbud is being scaled down and sent to Quanta. Everything is being sent to Kleverb and Granular!

Video:

Audio:

3 Likes

OK, so I dug a bit more into Xequence and where it fits in the iOS ecosystem alongside similar-sounding MIDI recording apps like Atom and Photon.

Atom and Photon are AU MIDI apps that are meant to sit inside an AU host like AUM. One user replied to my inquiry on the AB forum that he likes using Atom to quickly capture jams and Xequence for further development of the captured musical ideas (MIDI), but he also says he's starting to use Xequence more than Atom.

The Xequence 2 manual includes use cases like these:

  • Sequencing multiple synths loaded into AUM
  • Redirecting MIDI from an external arpeggiator (StepPolyArp) to a Xequence MIDI instrument, with the option to record the MIDI
  • Using Xequence, AUM, and Audiobus together

http://seven.systems/xequence2/en/manual/

Some thoughts I gathered on Atom vs. Photon:

Atom

  • No MIDI export
  • MIDI editing
  • Universal

Photon

  • Groove facility
  • MIDI export
  • Start your loop in different places
  • Only basic MIDI editing
  • No iPhone version

From the Photon dev:

Think of Photon as more of a session recorder (MPE and all non-system MIDI messages), so it sits well as a passthrough device capturing your MIDI and sharing it. I think it will complement Atom well when Atom gets MIDI import/export/share.

User comment:

Well said. I see the two as complementing each other nicely. I think Photon can be especially useful for just noodling around. Rather than setting up six Atom instances, deciding on the loop lengths in advance, then flipping around between them to start and stop, you have six slots to work with all in one place. Set up properly, you can use Photon to transparently suck up all the MIDI from a live jam and play it back when you're done. Kind of like AUM's audio recording capability, but for MIDI.
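That "transparent passthrough that captures everything" idea is simple enough to sketch. A rough, purely illustrative Python version using the mido library (port names are placeholders; this is not how Photon is actually implemented):

```python
# Illustrative MIDI passthrough recorder (not Photon's code).
# Forwards every incoming message to the synth while timestamping a copy,
# so a jam can be played back or exported later.
import time
import mido

captured = []  # list of (seconds_since_start, message)

with mido.open_input("Controller") as inp, mido.open_output("Synth") as out:
    start = time.monotonic()
    for msg in inp:
        out.send(msg)                      # transparent passthrough
        if msg.type != "clock":            # skip the flood of clock ticks
            captured.append((time.monotonic() - start, msg))
```

From there the captured list could be quantized, replayed, or written out as a standard MIDI file, which is roughly the "AUM-style recorder, but for MIDI" role described above.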

3 Likes

I sure am digging the iPad these days. Thanks for sharing, will have some fun with this for sure.

1 Like

Been following the AB forum discussion concerning the above video.

I tried following the “Xequence 2 with StepPolyArp” example in the manual, subbing Arpeggionome Pro for StepPolyArp, and was able to get Xequence 2 to record Arpeggionome Pro MIDI data, while Arpeggionome was playing Spectrum’s Resonator (hosted in AUM) at the same time. I was also able to get Xequence 2 to replay the MIDI data back out to Resonator.

However, I was unable to get Rozeta Particles (in AUM) to work similarly with Xequence. This video will help a lot since it shows an example of Rozeta Collider playing another AUM-hosted synth, and MIDI data being recorded into Xequence.

The dev posted this response:

Two small suggestions: It is preferable to send the MIDI directly to AUM (select “AUM” as MIDI destination) and enable “‘AUM’ Destination” instead of “Xequence Source” in AUM’s channel popups. No need to turn on “Virtual Source” in Xequence then, and you will get better timing. Also, it is easier to see the actual MIDI routing at a glance.

(“Virtual Source” is primarily meant as a last resort when you want to control an app that doesn’t have a virtual destination).

And there’s no need to set the recording sources for a track if you don’t want to do Multitrack Recording – everything that’s received from any source or channel simply gets recorded on the current track :slight_smile:

Again, thanks for the great tutorial and hands-on video!

1 Like

Wow great info. Thanks!

1 Like

Eventide :slight_smile:



16 Likes

Woo!!

Oh wow, interesting to see more big players get involved on iOS. Might have to pick up Blackhole; love that effect on desktop.

Dauhm.

And on iPhone? Crazy.

I expect they're CPU gluttons. Will put off until I get a newer iPad.

1 Like

It's great to see Eventide bringing their DSP to iOS land. We've been lacking good-sounding delays and reverbs for years now.

FabFilter and Eventide plugins as AUs are tightening the gap between DAW music-making and iOS music-making even further, as far as sonics are concerned.

5 Likes