This is the boat I'm in. I use a 2-iPad solution live. Right now I have the iPads routed via ADAT to a MOTU, but I miss a real mixer interface.
I'm looking at using USB like you do with the A&H. I'll probably have to go with the 96, but I can't justify the cost right now. If prices on the used market come down a bit more, I'll probably dump the MOTU/ADAT solution altogether.
I’m looking at some other options that are class compliant, but I’d still like it to be a universal piece of gear to stick on a rider.
In Drambo you can use a different number of steps per track by using the Jump module. You also don't have to use the Drambo sequencer at all; you can use the same sequencers you have in AUM.
Well, my OT is hooked up MIDI Out to the MIDI In of a BF Pro, which is class compliant and can therefore communicate with the iPad.
In the apps I select the MIDI channel to receive CCs/notes and open them within AUM. In AUM's MIDI routing it's a bit strange: I have two BF Pro ports showing up and I have to connect the 1 with … the 1.
I don't really get it, but it works fine that way.
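For what it's worth, a class-compliant box exposing more than one endpoint is normal, and AUM presumably just lists whatever CoreMIDI reports. A rough sketch of that enumeration (illustrative only, not AUM's code):

```swift
import CoreMIDI

// List every MIDI source the iPad currently sees, roughly what AUM's
// routing matrix enumerates. A class-compliant interface like the BF Pro
// can expose more than one endpoint, which is why two ports show up.
for i in 0..<MIDIGetNumberOfSources() {
    let source = MIDIGetSource(i)
    var name: Unmanaged<CFString>?
    if MIDIObjectGetStringProperty(source, kMIDIPropertyDisplayName, &name) == noErr,
       let displayName = name?.takeRetainedValue() {
        print("Source \(i): \(displayName)")
    }
}
```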
Can this be achieved by adding a “Jump to step 1” condition thing to a given step? This is how I’ve been working around the lack of individual track lengths, e.g. put a “Jump” on step 16 to have a 4 beat loop in a longer pattern.
Been playing a lot with Drambo in the last few days and selling a friend on it. Really an incredible app. So much flexibility and power, sounds so great, but still straightforward to use. Being able to do all this on the sofa with a machine that doesn't get hot or anything is amazing, and honestly it's making all my hardware feel slightly redundant right now lol.
One thing I've found slightly tricky is tweaking small controls, especially the "modulation amount" dials. It's hard to move them a tiny amount, which can be an issue for some sounds. Is there a way to finely adjust these parameters that I'm missing?
Edit: and as you say, the developer seems incredibly driven to improve the app, and is extremely talented. I’m confident that anything we are missing (e.g. scale per track) will be implemented at some point, and will be implemented in a clever and flexible way (as e.g. AudioUnits were).
That is just MIDI control though, which is a breeze to set up. AUM doesn't sync to clock. So if you want the OT to start/stop AUM and send it the tempo, it can't. Any sequencers or time-based effects in AUM will not start, stop, or stay in sync. That's the issue.
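For anyone following along, "syncing clock" here means following the MIDI real-time messages the OT is already sending: 0xF8 ticks at 24 PPQN plus 0xFA (start) and 0xFC (stop). A rough sketch of what a slaved host has to do, just to show it isn't exotic (illustrative Swift, nothing to do with AUM's internals):

```swift
import Foundation

// A host that follows external clock derives its BPM from the spacing
// of 0xF8 timing-clock bytes (24 per quarter note) and obeys start/stop.
struct ClockFollower {
    private var lastTick: TimeInterval? = nil
    private(set) var bpm: Double = 120
    private(set) var running = false

    mutating func handle(byte: UInt8, at time: TimeInterval) {
        switch byte {
        case 0xFA: running = true            // Start
        case 0xFC: running = false           // Stop
        case 0xF8:                           // Timing clock, 24 per quarter note
            if let last = lastTick {
                let secondsPerBeat = (time - last) * 24
                // Smooth slightly so jitter doesn't make the tempo wobble
                bpm = 0.9 * bpm + 0.1 * (60 / secondsPerBeat)
            }
            lastTick = time
        default: break
        }
    }
}
```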
Possibly. I’ve just not put too much time into Drambo as it is still missing a few things for me. The recent update to add external sync to standalone was big for me. I then found out about the audio input issue of only being able to tap the first two inputs of any interface. So, for my use case, it won’t work. Yet
I don't know if there's a fine-adjust method yet. I "think" Reason Compact uses a method where you touch the control and pull off to either side. It then turns into a slider that lets you do fine adjusting. Really nice and easy to use. Somebody should confirm that and then put it on the Drambo dev's radar. I have a Logitech Crayon that I bought for my iPad Pro. Very helpful for programming and dialing things in.
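Nobody outside those devs knows the exact implementation, but the general "pull off to the side for fine control" idea is simple enough. A toy sketch, with all the numbers made up:

```swift
// Not how Drambo or Reason Compact actually do it, just the general idea:
// the further the finger sits from the knob horizontally, the smaller each
// point of vertical drag moves the parameter.
func parameterDelta(dragPoints: Double, horizontalOffset: Double) -> Double {
    let coarse = 1.0 / 200.0                     // full range over ~200 pt of drag
    let fineFactor = 1.0 / (1.0 + abs(horizontalOffset) / 50.0)
    return dragPoints * coarse * fineFactor      // normalized 0...1 parameter change
}
```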
That's why I asked if you were asking about AUM being a MIDI slave. It can't be a slave, although Start/Stop is working for me, as is the sync with any FX or instrument/generator opened within it; you just have to set the channels for each individually.
If it works that way, why would the developer bother to implement MIDI slave?
Depending on your setup and needs, the PX5 might work for you. I know the Xone:96 has two independent sound cards, which you could assign to each iPad. The PX5 has a 10-in/10-out card. Though I can't find information on it, I believe you can set the USB input sources to 'Line' on the back of the mixer, then set the channel input source to USB on the top panel. If it works like I believe it does, you would be able to plug your hardware, if any, into the mixer channels, run them through corresponding AUM channels with EQ/compression etc., then back pre-fader/FX into the mixer. The PX5 can even be clocked externally via USB MIDI (not by DIN though), so all of your PX5 effects will be tempo-synced.
The Xone:96 is a gorgeous mixer, but for me it's A) expensive and B) a bit big to slog around. I have a decent amount of kit to move with me and I'm trying to reduce footprint. I don't want a super-mini setup, just something a bit smaller than it is now.
I’ll keep you posted in a PM regarding the PX5 or bump the very small thread regarding it and tag you in.
I understand that it cannot be a slave. That is the problem. I fully understand how to set up MIDI control of parameters for any devices hosted in AUM, or even for AUM's controls itself. This includes mapping the transport controls to external messages. This is not clock slaving though. It will not stay in sync with my external BPM or respond to any tempo changes. It would be prone to drift and other issues.
This doesn't work for time-based effects, especially when using beat divisions in, say, a delay or the pre-delay on a reverb plugin.
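That's the crux: a synced delay derives its time from the tempo, so if AUM's BPM doesn't track the OT's, every division is off. Quick numbers:

```swift
// Why clock matters for time-based FX: the delay time comes straight
// from the tempo, so a fixed host BPM means fixed (wrong) divisions.
func delaySeconds(bpm: Double, beats: Double) -> Double {
    (60.0 / bpm) * beats
}

print(delaySeconds(bpm: 126, beats: 0.75))   // dotted eighth at 126 BPM ≈ 0.357 s
print(delaySeconds(bpm: 128, beats: 0.75))   // same division at 128 BPM ≈ 0.352 s
```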
The issue with iPad apps in general is that they see themselves as masters of their own domain. It seems inconceivable to the devs that musicians would want to incorporate them into hardware setups, or into setups that are more than just a pool of Link-enabled software solutions. It is getting better, and more devs are enabling and supporting external MIDI sync. AUM just refuses to add this feature. And trust me, it's important and desired by people like myself (of whom there are many).
I was also disappointed to see that AUM could not be a slave itself.
Once again, there’s a simple workaround that works: my synth apps are synced, effects as well, following the BPM given by the OT.
I don’t get the problem.
If you have the time or energy, can you provide a step-by-step of how you are achieving this?
Any apps I load in AUM look to AUM for clock and nothing else. That is one of the functions of a host: providing BPM and transport (start/stop). If AUM itself cannot be slaved to the OT, then how do the apps it is hosting get clock?
Any apps I have, and I have many, have either the option to clock themselves independently from their host or to sync to their host.
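For what it's worth, this is roughly how an AUv3 plugin gets clock from whatever hosts it: the host installs a musical-context block on the AUAudioUnit, and the plugin queries it while rendering. So if the host's own tempo never follows the OT, nothing it hosts will either. A minimal sketch (assuming a host that actually provides the block; this is not AUM's internals):

```swift
import AudioToolbox

// Read the host tempo from the musicalContextBlock the host installs on
// the plugin's AUAudioUnit. Whatever BPM the host believes in is what
// every hosted plugin sees.
func currentHostTempo(of audioUnit: AUAudioUnit) -> Double? {
    guard let musicalContext = audioUnit.musicalContextBlock else { return nil }
    var tempo = 0.0
    var beatPosition = 0.0
    // Pass nil for the fields we don't need (time signature, downbeat, etc.)
    if musicalContext(&tempo, nil, nil, &beatPosition, nil, nil) {
        return tempo   // BPM as reported by the host
    }
    return nil
}
```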
I understand that you could set your OT to 128 BPM, set AUM to 128 BPM, and map the OT's transport controls to the transport controls of AUM. You could even start and stop things and all would be well. Until the tempo drifts, or jitter occurs, or you change the OT's tempo to 126. AUM would not follow suit, and then everything would be out of sync.
So, if you've found some magical workaround other than what I've generally detailed above, I would sincerely love to know how you are doing it. Because without actual external MIDI sync, the above setup would be fragile at best.
Edit: And to be clear, I don't care how the apps being hosted in AUM get external clock (other than through any of the "less-than-stable" solutions like nesting in Audiobus or MIDILinkSync). I just care that they get it and can follow changes to it. AUM's transport doesn't even need to start. I'm more than happy to host non-time-based effects to process audio, or use apps like Mononoke where I don't care about clocking it (it's just wild as is).
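Just to put numbers on "out of sync": park AUM at 128, nudge the OT to 126, and the two clocks separate by 2 beats every minute, a full bar of 4/4 after two minutes.

```swift
// How fast two fixed tempos walk apart from each other.
func beatsOfDrift(bpmA: Double, bpmB: Double, afterSeconds t: Double) -> Double {
    abs(bpmA - bpmB) * t / 60.0
}

print(beatsOfDrift(bpmA: 128, bpmB: 126, afterSeconds: 60))    // 2 beats
print(beatsOfDrift(bpmA: 128, bpmB: 126, afterSeconds: 120))   // 4 beats, a bar of 4/4
```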
Have you considered using Drambo and AUM alongside (in parallel with) Audiobus? As you may be aware, Audiobus indicates that it is possible to set Drambo and AUM as MIDI Clock destinations. That, along with the fact that you can set Audiobus to slave to external MIDI Clock sources, might be a solution, particularly if 'nesting' is somehow problematic for you. If that doesn't work (and at first glance it appears that it ought to), then the question that follows is whether there is a problem with Audiobus's implementation of MIDI Clock; if so, perhaps take it up with the developer: https://forum.audiob.us/
When you manage to solve your Midi Clock conundrum, please let us know how it was done!
Re. track scaling: while Drambo has nothing at present that is precisely equivalent to the Digitakt (DT), for instance, there is the option of using step components, which give you myriad combinations of cycle/random/scene conditions leading to nonlinear jumps that can potentially match anything possible on the DT in terms of polyrhythmic complexity. You just need to work with what's there at present.
The other option within the latest version of Drambo is the newly introduced Euclidean step sequencer, which, when hooked up to drum instruments, should give you another means of introducing polyrhythms within Drambo.
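For anyone curious what a Euclidean sequencer is doing under the hood: it spreads a number of hits as evenly as possible across the steps. A quick sketch of the idea (a Bresenham-style spread that lands on a rotation of the classic Bjorklund result, not Drambo's actual code):

```swift
// Spread `hits` triggers as evenly as possible across `steps` steps.
func euclideanPattern(hits: Int, steps: Int) -> [Bool] {
    guard steps > 0, hits > 0 else { return Array(repeating: false, count: max(steps, 0)) }
    return (0..<steps).map { (($0 * hits) % steps) < hits }
}

// E(3, 8) gives the tresillo: "x..x..x."
print(euclideanPattern(hits: 3, steps: 8).map { $0 ? "x" : "." }.joined())
```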
Maybe Mononoke doesn't have a MIDI channel to set; I don't know the app, but all my other synths do.
That shows they were not designed to be master only. Some don't have the ability to be a slave, as they were conceived as generators. Some, like Audiokit Synth, don't follow BPM, but overall it is working on my side.
Could you provide a list of your synth apps? I could give a concrete example if we have one in common.
One explanation could be that you're trying to use only the AUv3 versions of the apps. I can't use them, as they don't work on my iPad, but AUv3 will indeed depend on the host, whereas if you open the IAA versions it solves the problem and you can get external sync.
And this is most likely the case right here; I use almost exclusively AUv3 apps. IAA works in AUM, but I believe what is actually happening is that the IAA app is hosted externally and its audio is bussed through AUM. In these cases, yes, the IAA app would potentially have the ability to be slaved. This would be on a case-by-case basis. AUv3 is ultimately where Apple wants things to go as well, AFAIK. I don't know how long IAA is for this world. Might be around for a while, or Apple might kibosh it at the drop of a hat. Apple's gonna Apple.
This. And also, how are you getting MIDI into the iPad? Which interface?
Does Mononoke not accept MIDI being routed to it via the MIDI matrix in AUM (bypassing the transport, as you suggested)?
The problem isn't mapping parameters or routing MIDI inside of AUM. It's getting clock in. Unless something has changed, if I host apps in AUM I cannot send them clock unless I either 1) nest it in Audiobus (unstable) or 2) use MIDILinkSync (unstable, with no transport sync).
AUM works great for every app I use. I can map anything I want. This is using many different external controllers: MIDI Fighter Twister, Korg Nanokontrol2, MIDI messages from various pieces of hardware. This can be through an audio interface (UR44C or Scarlett 2i2/2i4) or with a Roland UM-ONE v2 connected to a USB hub. My issue isn't setting up or controlling devices in AUM with MIDI. It works a treat! Honestly!
The issue is exclusively with external clock. AUM checks every. Single. Box. All of them. Except you cannot clock it, which means any devices it is hosting cannot receive clock or get the benefits of clock. Delays can't be tempo-synced/divided. Pre-delays on reverbs, same. Samples can't be tempo-locked or launched in sync.
I totally understand that I could use Audiobus or MIDILinkSync, but I've had stability/usability issues with both in the past. What I do for performance does not rely on the iPad. It would very much be a "nice to have" so I can do the things I can't do with my hardware. Namely, Mononoke for drones, and quantized playback of long samples. Add to this the desire to leverage the myriad delay and other time-based effect options in iOS.
I appreciate the suggestions. They are all ones that I have heard, tried, and ultimately do not click with for reliability reasons. I don't want to put a ton of time and effort into getting clock sync to "mostly work" only to have the whole thing shit the bed on stage. AUM is simple and brilliantly designed, but missing that one crucial feature.
Until that is addressed OR Drambo addresses the small quibbles I have (audio interface input routing), I will just use the iPad for sketches and such. I have the utmost confidence that Drambo will fit the bill very soon.
EDIT: Just read in the Drambo thread the roadmap for the next few features. Multi-route interface I/O, followed by essentially per-track length and speed. Once complete, this will address 99% of the snafus I have. MIDI feedback to my controllers would be the last niggle. If Drambo has a configurable MIDI takeover mode, it would be a tolerable solution to the lack of a proper feedback implementation. Love that the dev is active and engaged.
It does. No more parameter jumps.
And it's also implemented on a per-track basis, meaning you can have 1 encoder assigned to MIDI channel 1 and CC 1, and use it on 16 different tracks. It will only affect the selected track (or you can set it to global).
You really could use a controller in OMNI mode and tell track 2 to only respond to MIDI channel 2. It cleared up 99.9999999% of my MIDI issues, and this includes clock.
I can't tell you how many controllers I've hand-built and coded just for AUM control because it's all absolute CCs. Drambo solved all of that with a simple update.
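The takeover idea, for anyone who hasn't run into it: with absolute CCs the host ignores incoming values until the physical knob catches up with the parameter's current position, so switching tracks never causes a jump. A sketch of the usual pickup-style logic (not Drambo's actual implementation):

```swift
// "Pickup" takeover for absolute CCs: only follow the incoming value once
// the knob has reached or crossed the parameter's current position.
struct SoftTakeover {
    var parameter: Double           // current value, 0...127
    var pickedUp = false
    var lastCC: Double? = nil

    mutating func handle(cc value: Double) -> Double {
        if !pickedUp {
            if value == parameter {
                pickedUp = true
            } else if let last = lastCC,
                      (last <= parameter && value >= parameter) ||
                      (last >= parameter && value <= parameter) {
                // The knob just crossed the parameter's position: pick it up
                pickedUp = true
            }
        }
        lastCC = value
        if pickedUp { parameter = value }
        return parameter
    }
}

var filterCutoff = SoftTakeover(parameter: 100)
print(filterCutoff.handle(cc: 20))    // still 100, knob hasn't caught up yet
print(filterCutoff.handle(cc: 101))   // crossed 100, CC takes over: 101
```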
j_liljedahl is the author. In this post you can see his thoughts; he doesn't believe a DAW should be a slave to anything else: AUM MIDI CLOCK IN — Audiobus Forum
I’ve read that thread before. I’ve just come to accept the fact that the dev will never implement it. Many examples of apps supporting external clock sync are given in that thread. AUM would just be so brilliant if it did too.
I think Drambo is really going to be the way forward, even if it’s just for hosting AUv3 apps. There’s some IAA stuff I use as well that can’t be hosted in Drambo though AFAIK. Beatmaker 3 would almost be a good solution for my needs too.