Kinda cool if you don’t like being tied to a computer. According to Synth Anatomy, it’s pretty much the same MIDI capture function you get from Ableton.
Pretty simple and very cool. Will convert a controller to Bluetooth MIDI as well. I would assume velocity is recorded, but I wonder about aftertouch/poly aftertouch. It says “all MIDI messages and all 16 channels are recorded” so maybe?
Yeah, that means all of it. It’s just data and a time stamp (like a standard MIDI file). How they organize the storage is significant, probably by sessions and when it was recorded, with perhaps the ability for you to add labels and other classification information after. You should be able to selectively delete.
Whether you can then edit and re-store is a question.
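Since everything is supposedly just timestamped MIDI data, here’s a quick sketch of how you could check for yourself what actually got captured (velocity, aftertouch, channels) once you pull a file off the device. This assumes the recordings export as standard .mid files and uses the mido library; the filename is made up.

```python
# Sketch: scan an exported .mid file to see which message types and channels
# actually made it into the recording. Assumes the recorder exports standard
# MIDI files; "session_2024.mid" is a made-up filename.
from collections import Counter

import mido  # pip install mido

mid = mido.MidiFile("session_2024.mid")

types = Counter()
channels = set()

for track in mid.tracks:
    for msg in track:
        if msg.is_meta:
            continue
        types[msg.type] += 1           # e.g. note_on, control_change, polytouch
        if hasattr(msg, "channel"):    # sysex messages have no channel
            channels.add(msg.channel)  # 0-15 internally (1-16 on front panels)

print("Message types captured:", dict(types))
print("Channels used:", sorted(channels))
# mido names: 'aftertouch' = channel pressure, 'polytouch' = poly aftertouch
```

If 'aftertouch' or 'polytouch' shows up in that summary, you know expression data survived the trip.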
I did shoot the developer an email and he said yes, aftertouch is recorded, FYI. As someone who noodles on the keyboard pretty much daily, this is very appealing.
Eventually the problem will be the quantity of data, especially if you’re not allowed database sorts of tagging and classification and search. It would start to look like the Elektronauts Forum, god help us.
If you watch the videos that I posted, they show that it’s categorized by date or, I assume, user category. He’s scrolling through it on the app.
It’s at the bottom of the screen, so like sessions. Would be an absolute nightmare if this was implemented without any kind of tagging or file separation.
Theoretically you could store multiple parts in MIDI simultaneously, like you and a bandmate playing together, though it doesn’t sound like this is provided unless you did a MIDI merge: you take channels 0-7, I take 8-15.
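If you did go the merge route, pulling the two performances back apart is just filtering by channel. A rough sketch below, assuming a standard .mid export and the mido library; the filenames and the 0-7 / 8-15 split are only the example above, not anything the device itself defines.

```python
# Sketch: split a merged two-player recording back apart by channel range.
# Assumes a standard MIDI file export; "band_jam.mid" and the output names
# are made up, and the 0-7 / 8-15 split is just the example from the post.
import mido

src = mido.MidiFile("band_jam.mid")

def extract(channel_range, out_name):
    out = mido.MidiFile(ticks_per_beat=src.ticks_per_beat)
    track = mido.MidiTrack()
    out.tracks.append(track)
    pending = 0  # delta time accumulated from skipped messages
    for msg in mido.merge_tracks(src.tracks):
        pending += msg.time
        keep = msg.is_meta or (hasattr(msg, "channel") and msg.channel in channel_range)
        if keep:
            track.append(msg.copy(time=pending))
            pending = 0
    out.save(out_name)

extract(range(0, 8), "player_a.mid")   # my half: channels 0-7
extract(range(8, 16), "player_b.mid")  # bandmate's half: channels 8-15
```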
See, what isn’t recorded though is what’s connected to what, how you may have externally configured your synth, your mixer settings, etc., etc. So that sort of thing would be important to mark down somehow. This is only MIDI data in your log.
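One low-effort way to “mark it down”: keep a little sidecar notes file next to each exported session. Purely a sketch; every field name and filename here is hypothetical, not anything the device or app provides.

```python
# Sketch: a sidecar notes file kept next to each exported session, since the
# MIDI log itself says nothing about routing or synth/mixer state. Every
# field name and filename here is made up; it's just one way to write it down.
import json
from datetime import date

session_notes = {
    "midi_file": "session_2024.mid",
    "date": date.today().isoformat(),
    "tags": ["evening noodle", "pad idea"],
    "routing": {
        "channel 1": "polysynth, patch 07 'Glass Pad'",
        "channel 2": "drum machine, kit 03",
    },
    "mixer": "synth on ch 5-6, reverb send at 2 o'clock",
}

with open("session_2024.notes.json", "w") as f:
    json.dump(session_notes, f, indent=2)
```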
One of the industry gab-magazines (was it Synth Anatomy?) compared it to the MIDI capture that happens in a DAW like Ableton. I get the comparison, but this strikes me as different.
Seems like a good combination: you could have multiple devices hooked to the MRCC, then curate and map some combination to an output hooked to the JamCorder, and even change that around if you chose.