Kitten for OSX

it wasn’t easy! :stuck_out_tongue_winking_eye:

learned a ton about bits/bytes and computer memory during the process…
at some point even wrote my own little hex analyser thing just for this, to compare diffs between different kits/patterns/etc…:

it can read data straight from the machine, unwrap it from the sysex container and has a few modes for highlighting structures, comparing diffs etc…
basically: press space, tweak a knob, press space, look at data, rinse repeat… made things much easier.
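
a rough sketch of the "snapshot & diff" part (just the idea, not the actual tool - printDiff is a made-up name):

// compare two unwrapped dumps byte by byte and print every offset that changed
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

void printDiff(const std::vector<uint8_t> &before, const std::vector<uint8_t> &after)
{
    size_t n = std::min(before.size(), after.size());
    for (size_t i = 0; i < n; i++)
    {
        if (before[i] != after[i])
            printf("0x%04zx: %02x -> %02x\n", i, before[i], after[i]);
    }
    if (before.size() != after.size())
        printf("length changed: %zu -> %zu bytes\n", before.size(), after.size());
}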

then for every data type, there’s an OOP wrapper class and a bunch of helper functions so that working with the data is straightforward.

e.g. copying a track between two patterns, including p-locks & proper handling of track lengths etc.:

- (void)copyPattern:(ARPattern *)patternA track:(uint8_t)trackA
          toPattern:(ARPattern *)patternB track:(uint8_t)trackB
{
    [patternB clearTrack:trackB];
    
    ARPatternTimeMode sourceTimeMode = patternA.timeMode;
    ARPatternTimeMode targetTimeMode = patternB.timeMode;
    int sourceLen = sourceTimeMode == ARPatternTimeModeNormal ? patternA.masterLength : [patternA track:trackA].settings->trackLength;
    int targetLen = targetTimeMode == ARPatternTimeModeNormal ? patternB.masterLength : sourceLen;
    
    // copy the whole settings struct over, then fix up the track length for the target
    *[patternB track:trackB].settings = *[patternA track:trackA].settings;
    [patternB track:trackB].settings->trackLength = targetLen;
    
    // walk the target track step by step, wrapping around the source track when it's shorter
    int sourceStep = 0, targetStep = 0;
    for(; targetStep < targetLen; targetStep++)
    {
        ARTrig sourceTrig = [patternA trigAtStep:sourceStep inTrack:trackA];
        [patternB setTrig:sourceTrig atStep:targetStep inTrack:trackB];
        
        ARPVal locks[72];
        uint8_t numLocks = 0;
        if(ARLocksForTrackAndStep(patternA, sourceStep, trackA, locks, &numLocks))
        {
            for(int i = 0; i < numLocks; i++)
            {
                [patternB setLock:locks[i] atStep:targetStep inTrack:trackB];
            }
        }
        
        sourceStep++;
        if(sourceStep == sourceLen)
            sourceStep = 0;
    }
}

the entire MIDI communication is similarly abstracted away…

e.g. in the Kitten app, to load any kit into the current kit buffer, it’s just:

- (void) switchKit:(int)offset
{
    // keys is an array of string descriptors of the stuff you want to read from the machine.
    NSString *key = [NSString stringWithFormat:@"kit.%d", offset];
    NSLog(@"switching to %@", key);
    
    [ARRequest requestWithKeys:@[key]
             completionHandler:^(NSDictionary *dict) {
                 
                 ARKit *kit = dict[key];
                 [kit sendTemp]; // sendTemp sends kit to the current buffer
                 
             } errorHandler:^(NSError *err) {
                 
             }];
}

pretty much like an HTTP request :smiley:
so yeah, this stuff was a ton of work, but it's very reliable now and allows new features to be developed quickly…

Now that is a thing of beauty! I made a kind of difference analyser in Max to spot which bytes are changing, but it looks like you made the red pill…

Thanks for the work void!

How many times a day do I need to feed my new Kitten? :joy:

128^12 times :stuck_out_tongue_winking_eye:

(the beginning got a bit weird, somehow there’s some stray button-pressing sounds in there :confused: )

you’re a genius !

and the slo-mo jungle kinda beats around the 4 min mark deserve a fat track on vinyl
:wink:

great :+1:

this is pretty fantastic. awesome work, void!

holy shit, you are an animal. thanks void! can’t wait to try this out.

whiners are so boring.

so uh, anybody using this?

I’ve been too busy with projects to give it a proper go.
Things are calming down now though and I’m hoping to use it this month…if it proves stable enough I’ll use it in a live set at the end of the month.

cool - started porting this to Raspberry Pi, not sure if it's worth the effort though if it doesn't get used… it's pretty good functionality I think

Oh yes, +1 for Raspberry Pi, as it should also work on an Organelle from Critter & Guitari (it's an ARM Cortex A9 running Arch Linux).
And I have a mac but didn’t try it yet. :slight_smile:

hm cool, can the Organelle act as USB host?
the only dependency is ALSA RawMIDI via libasound2-dev.
otherwise it's a simple UI-less C++ program.
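
for reference, talking to the machine boils down to something like this (rough sketch - "hw:1,0,0" is just an example, the device name depends on how the interface enumerates):

// open the USB-MIDI port via ALSA RawMIDI and push raw sysex bytes through it
#include <alsa/asoundlib.h>
#include <cstdio>

int main()
{
    snd_rawmidi_t *in = NULL, *out = NULL;
    if (snd_rawmidi_open(&in, &out, "hw:1,0,0", 0) < 0)    // 0 = blocking mode
        return 1;

    unsigned char msg[] = { 0xF0, 0xF7 };                  // a real request carries the sysex payload in between
    snd_rawmidi_write(out, msg, sizeof(msg));
    snd_rawmidi_drain(out);

    unsigned char buf[1024];
    ssize_t n = snd_rawmidi_read(in, buf, sizeof(buf));    // blocking read of the reply
    printf("got %zd bytes back\n", n);

    snd_rawmidi_close(in);
    snd_rawmidi_close(out);
    return 0;
}

builds with g++ and -lasound, which is exactly what the libasound2-dev dependency is for.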

edit: if you are good with C++ lambdas get in touch - I’m kind of a noob with C++ and trying to figure out a thing with lambdas.

Edit: I see that the Organelle does not have a USB host plug - not sure if it’s worth doing this with the MIDI ports, would be a lot slower… also it’s hella expensive relative to a nimble vanilla rPI which is all you need for this, as Kitten is controlled from the Rytm…

edit: I was mistaken, Organelle can USB-host. Cool! Still super expensive for this particular application.

eh, wut ! woh … sweet, I’m in :+1:
exciting times - but is this only for high latency operations . ? !
if there’s scope for subtle midi hijacking in quick time, then I’m curious to hear about the method/technologies with a view to developing other ideas (& learning new chops)
.
my v1 pi is underused (basically USB midi hub once in a while) - but to have the incentive to twist that towards midi pal territory could be quite useful - if latency isn’t a deal-breaker
.
but as for risk/reward, who knows - I’m excited by nearly-computerless ventures, plus there’s scope to easily hack hardware control on top too

my hello-world was a USB midi-clock to Korg sync-pulse converter via GPIO, it’s tight AF.
using a PI v2 here, but this is really basic stuff, it compiles & runs quick enough on the pi.
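
the gist is something like this (rough sketch, not the actual code - assumes wiringPi for the GPIO side; pin 17 and "hw:1,0,0" are just examples):

// count MIDI clock bytes (0xF8) from an ALSA RawMIDI port and emit a short pulse
// on a GPIO pin every 12 clocks (MIDI clock is 24 PPQN, Korg sync is 2 PPQN)
#include <alsa/asoundlib.h>
#include <wiringPi.h>

int main()
{
    const int syncPin = 17;                  // BCM pin number, pick whatever is free
    wiringPiSetupGpio();                     // use BCM pin numbering
    pinMode(syncPin, OUTPUT);

    snd_rawmidi_t *in = NULL;
    if (snd_rawmidi_open(&in, NULL, "hw:1,0,0", 0) < 0)
        return 1;

    int clocks = 0;
    unsigned char b;
    while (snd_rawmidi_read(in, &b, 1) == 1)
    {
        if (b == 0xFA)                       // MIDI start: restart the count
            clocks = 0;
        if (b != 0xF8)                       // only clock bytes matter from here on
            continue;
        if (clocks++ % 12 == 0)              // 24 PPQN -> 2 PPQN
        {
            digitalWrite(syncPin, HIGH);     // short pulse for the sync input
            delay(4);
            digitalWrite(syncPin, LOW);
        }
    }
    snd_rawmidi_close(in);
    return 0;
}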

A nice-to-have would be an interactive debugger connection to the Mac, similar to iOS development - not sure how to set that up…

I definitely need to beta test that for ya, you know, just to be sure there are no catastrophic bugs lurking - clock dividing / swinging options ? :wink:

Yes, the Organelle can USB-host :slight_smile: (and actually doesn’t have any MIDI Din ports). It works well with the different USB MIDI compliant controllers and interfaces I tested. Not tried yet with an Elektron machine.
A UI-less program would be ideal. Unfortunately I haven’t touched a line of C++ code in about 10 years. I code in Java for my job, but I have some knowledge of functional programming. I can try :slight_smile:

it’s pretty much just a syntax thing, afaik lambdas are relatively new in C++. Asked a question on stackoverflow: http://stackoverflow.com/questions/36429162/asynchronous-request-response-using-lambda

Basically just trying to replicate that ObjC request-response pattern in C++. It has proven to be very convenient for Elektron-sysex things. I can’t use the ObjC code on the PI…
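
what I’m after is roughly this (sketch only - requestWithKeys, Response and ARError are made-up names here, not the real port):

// pass the success / error handling in as lambdas, like the ObjC blocks above
#include <cstdint>
#include <functional>
#include <map>
#include <string>
#include <vector>

struct ARError { std::string message; };

using Response        = std::map<std::string, std::vector<uint8_t>>;
using ResponseHandler = std::function<void(const Response &)>;
using ErrorHandler    = std::function<void(const ARError &)>;

void requestWithKeys(const std::vector<std::string> &keys,
                     ResponseHandler onResponse,
                     ErrorHandler onError)
{
    Response result;
    for (const auto &key : keys)
    {
        // ...send the sysex request for `key` and collect the decoded reply here...
        result[key] = {};
    }
    if (!result.empty())
        onResponse(result);
    else
        onError({"no response from the machine"});
}

// usage, mirroring the ObjC version:
// requestWithKeys({"kit.3"},
//     [](const Response &dict) { /* kit data arrived */ },
//     [](const ARError &err)   { /* handle the error */ });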

Ah ok.
It seems you have your answer: a lambda can be used as a callback in C++.

Hi @void,
I ran a small experiment this weekend to test MIDI out from the Organelle to the AR. It worked really well :)

Hi void - I’d like to try the second version of Kitten, but this link seems to give me the same version - 1.01(2) again?
I can change Kits but the FX SMP, ENV and LFO don’t do anything.