To my knowledge, the resolution is the same; it’s just the speed that is lowered.
live.remote~ takes an “audio” signal as input while live.object takes messages.
Once again, OSC has nothing to do with it. It’s only the way information is transmitted or received. It could be MIDI, radio waves, electricity, whatever. The best way is to simply ignore this part for now.
So in Live, to have feedback from a parameter, you simply have to observe its value. To do so, you use an object called… live.observer. You bind it to a parameter and it will then continuously output its value. Controlling and observing a parameter are completely independent processes. You can do either or both.
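The control/observe split can be sketched in plain Python (just an illustration of the idea, not the actual Live API; live.observer is what does this for you in Max):

```python
# Minimal observer-pattern sketch: controlling a parameter and
# observing it are two independent operations on the same object.

class Parameter:
    def __init__(self, value=0.0):
        self._value = value
        self._observers = []          # callbacks fired on every change

    def observe(self, callback):
        """Like live.observer: bind once, then receive every new value."""
        self._observers.append(callback)

    def set(self, value):
        """Like live.object setting a value: control the parameter."""
        self._value = value
        for cb in self._observers:
            cb(value)

volume = Parameter(0.85)
seen = []
volume.observe(seen.append)   # observing...
volume.set(0.5)               # ...and controlling, independently
# seen == [0.5]
```

You can do either half alone: set without ever observing, or observe a parameter that something else (the GUI, another device) is changing.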
A few basic patches to illustrate (using track 1’s mixer volume as an example):
Thanks so much for this info! I’m gonna try and edit an OSC M4L device and see how it goes.
Sorry, I meant specifically the official OSC TouchOSC device bundled with the Connection Kit. I’m trying to understand how a combination of the two devices, OSC TouchOSC and OSC Send, could ever work for bi-directional feedback.
I know all kinds of wizardry is possible; look at the Knobbler app, somehow he gets labels and values sent from Ableton.
Ah ok,
I just checked inside the device, and it does indeed work with live.remote~. But it’s “buried” in a bpatcher contained in a bpatcher contained in a bpatcher, so it’s not the easiest thing to edit if you don’t know exactly what you’re doing.
EDIT: Also, their device simply doesn’t allow for parameter feedback. Otherwise you’d have a second OSC port for data going out.
EDIT 2: I didn’t see the OSC Send device. Indeed, it would be used for feedback. It’s also a bit tedious, as you have to map everything twice.
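To make the two-port idea concrete, here’s a stdlib-only Python sketch of what a minimal OSC packet looks like on the wire, per the OSC 1.0 spec (the port numbers and address pattern are made up for illustration; a real TouchOSC layout defines its own):

```python
import socket
import struct

def osc_message(address, value):
    """Build a minimal OSC packet: null-terminated address padded to a
    multiple of 4 bytes, a ",f" type tag, then one big-endian float32."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# Bi-directional feedback needs two independent ports (hypothetical numbers):
IN_PORT = 9000    # Live listens here for incoming control
OUT_PORT = 9001   # Live would send parameter feedback out on this one

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/track/1/volume", 0.85), ("127.0.0.1", IN_PORT))
```

The point is only that the transport is symmetric: the same kind of packet goes out on the feedback port, which is why the bundled device needs a second port (or the separate OSC Send device) before feedback is possible.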
If I have a pattern in the Digitakt that’s quantized with “54%” swing, how do I set the Quantize on a MIDI clip in Live to match that exactly? Sorry I’m like really new to Live.
I tried setting it to: [1/16], Amount: [54%], but that’s not lining up perfectly. Clearly I’m not understanding something.
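For what it’s worth, assuming the Digitakt uses MPC-style swing semantics (50% = straight, and the value is where the off-beat 16th sits within an eighth-note pair), the 54% offset works out like this:

```python
# Sketch of MPC-style swing math (assumption: the Digitakt's swing value
# is the position of the off-beat 16th within an eighth-note pair,
# with 50% meaning exactly halfway, i.e. no swing).

PPQ = 96                # ticks per quarter note
PAIR = PPQ // 2         # an eighth note = two 16th steps = 48 ticks
SIXTEENTH = PPQ // 4    # 24 ticks

def offbeat_position(swing_pct):
    """Tick position of the off-beat 16th within its eighth-note pair."""
    return PAIR * swing_pct / 100

straight = offbeat_position(50)   # 24.0 ticks: exactly on the grid
swung = offbeat_position(54)      # 25.92 ticks
delay = swung - straight          # ~1.92 ticks, i.e. 8% of a 16th late
```

That small, constant delay on every off-beat 16th is why a plain 1/16 quantize won’t line up: Live’s quantize Amount controls how strongly notes snap to the straight grid, not a swing percentage; grooves are the way to get the offset.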
You can extract a groove from an audio clip in Live. Just record a bit from your Digitakt, extract the groove, and use it to swing the stuff you’re doing in Live.
Simply trying to get the ‘live.remote~’ objects changed to ‘live.object’ inside the official OSC TouchOSC device (included with the free Connection Kit pack).
If anyone can point me in the right direction, I would be very grateful!
As I wrote, not that obvious at all in this case.
Also, careful when editing devices, you should copy them first as it’s easy to break them.
I think this conversation should have its own thread, as it’s technical and probably boring for most of the people watching this topic, isn’t it?
I find Philip Meyer’s tutorials really clear… I just recently watched the one on the Live Object Model, and he did a good one on the transformation devices/arrays.
I’ve recently been thinking about a very minimal Ableton installation with no core library, just a small amount of my sample packs, max devices and a few Live packs. Has anyone else tried this? Is it even possible to remove or not install the core library?
I imagine it’s called “core” for a reason - lots of Simpler/Sampler, Granulator, Hybrid Reverb and Drum Rack presets will use those samples, IRs & loops and will otherwise fail w/o them. But I guess you can just try & remove them manually.
I don’t believe you can install Live w/o core library from the start.