I see plenty of benefit. Just not as much as if it had encoders and maybe LED rings.
I think both resolution and property exchange are killer features.
I have faith that the controllers are coming. Definitely not here yet. We need loads of controls like endless encoders, faders, knobs, and buttons all with little screens or display strips so it can tell you what’s being controlled and the value.
The interface should make it very easy to move things around. It will save templates so you can easily recall. It will have the ability to easily cycle through many connected instruments and dynamically change seamlessly. One controller interacting with all your gear.
This is the future I want to see and perhaps Arturia or Novation will get us there.
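On the resolution point mentioned above: MIDI 1.0 CCs carry 7-bit values, while MIDI 2.0 Channel Voice CCs carry 32-bit values. A minimal Python sketch of upscaling a 7-bit value by bit repetition — a simple illustrative scheme so that 0 maps to 0 and 127 maps to full scale, not necessarily the spec’s exact translation algorithm:

```python
def upscale_7_to_32(v):
    """Upscale a 7-bit CC value to 32 bits by repeating its bits,
    so 0 -> 0x00000000 and 127 -> 0xFFFFFFFF (illustrative scheme)."""
    assert 0 <= v < 128
    out = 0
    shift = 32
    while shift > 0:
        shift -= 7
        # When shift goes negative, take the top bits of v for the tail.
        out |= (v << shift) if shift >= 0 else (v >> -shift)
    return out & 0xFFFFFFFF

print(2 ** 7)                       # 128 steps in MIDI 1.0
print(2 ** 32)                      # 4294967296 steps in MIDI 2.0
print(hex(upscale_7_to_32(127)))    # 0xffffffff
```

A filter sweep that stepped audibly through 128 increments gets over four billion increments to work with.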
So far my impression of MIDI 2.0 is that it was invented for customers to bother both hardware and software vendors about MPE support.
However, I don’t hear about any real-world benefits of MPE.
With all due respect, it is my impression that you are wrong on both counts. But you are of course entitled to your opinion.
I remain skeptical that it will be adopted properly as a standard. Even MIDI 1.0, which is of course much simpler, still isn’t always implemented properly: some companies insist on using Type B TRS, some don’t handle Note Off correctly, some botch MIDI clock, and so on.
MIDI 2.0 obviously adds a whole lot more complexity, so I don’t have much optimism that companies will adopt everything properly.
Hopefully I’m wrong and just being overly cynical.
I’m very excited about the concrete improvements we’ll see with Midi 2.0 and not at all concerned that it’ll be a “meh” sort of thing after it happens. It will take time for all companies to implement it well, but ultimately it’ll provide a ton of functionality that’ll really improve our quality of life in the studio.
I appreciate very much that there’s a lot of focus on getting manufacturers of both DAWs and gear involved from the start, and that it’s been a very deliberate process. It’s not going fast, but I can’t imagine there’s much, if any, money pushing this along, so it’s likely a lot of true believers putting in the time because they know it’s important.
My personal hope is that Midi-based timing gets somewhat more reliable/tighter, but in reading over some articles posted earlier in the thread, it sounds like jitter can be significantly improved upon, but latency will continue to be a bottleneck that goes beyond what Midi could solve within the spec itself. But still, a more robust standard that can offer more information and perhaps help to simplify timing problems and solutions is still a step in the right direction.
I’ve got nothing but optimism and hope around MIDI 2.0 coming along. I enjoy reading the little updates as we move closer and closer towards its birth.
Historically, I think a lot of folks were implementing the spec from scratch. I’m guessing that for 2.0, we’re going to see a few open source implementations in different languages that get ported to whatever platform they need to run on, and manufacturers just using those. The way we develop software has changed a lot since midi 1.0, but I think the protocol dragged a lot of older practices with it. In a way, part of the problem was that it was so simple and every dev thought they could just bang out an implementation themselves — which they could, just not a fully correct one. That’s not to say that there won’t be plenty of midi 2.0 bugs too — it’s a complex new spec, after all — but they’re likely to be at a slightly higher level, once we get through an initial shaking out period.
I will say that it does have a bit of the smell of second system effect, though, as much as I do think it’s going in the right direction.
This is just me speculating, but maybe Electra One will show new MIDI 2.0 support at their booth at Superbooth.
That would also account for the headroom in the new MkII hardware, with its extra processing power and memory.
The ERAE II configurable contact surface controller from Embodme is on Kickstarter right now, and they are promising MIDI 2.0 support. (See the Q&A video at 23:20.)
This should present some interesting challenges and opportunities in design. Do you get to set up surface elements that get negotiated in as you plug devices in? It also has a host port, so you could plug another controller into it, potentially also a MIDI 2.0 controller. How this all works together with other systems seems an up-in-the-air proposition, and at the moment I expect it to still be being refined at the target release date in September of this year.
I posted about the ERAE II over here.
Lemur
What Elektron collab would you wanna see?
Last night I had a dream where I was jamming with an Octatrack, a Monologue, and a Drumlogue. It felt fluent and sounded great. No MIDI CC tables anywhere; it all felt integrated.
The sequencers worked in tandem somehow.
After i woke up i thought more about it.
It was in 1981 that Dave Smith (Sequential) and Ikutaro Kakehashi (Roland Corporation) came together and began the work that became MIDI 1.0, changing our world forever. Pioneers who sadly are no longer with us. RIP.
I love that stuff. But it is 2024 and I want more. xD
MIDI 2.0 is on its way, but I think these days we as musicians are the ones to come up with new ideas, and manufacturers need to adapt to our needs.
Cooperation is good and bad, as we know.
I don’t read MIDI specs. The only glimpse I got was of a new keyboard from Korg with 2.0, where I think a bidirectional CC thing was going on. (Don’t quote me on this ;p)
I’d love to see a round table of Japanese and Elektron folks (add anyone you think should sit there, but keep in mind it took two people in the early ’80s) talking about how they can make things align.
OK, let’s talk about what your wishes for the implementation of MIDI 2.0 are. Let’s help out our beloved synth manufacturers. Dream on.
The only thing I can imagine is a sequencer automatically receiving all parameters it can modulate on a given device, including names. So when I configure an LFO on my Oxi One, I wouldn’t have to check a synth’s MIDI implementation table.
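For what it’s worth, MIDI-CI Property Exchange moves exactly this kind of metadata around as JSON. Here’s a hypothetical Python sketch of the sequencer side of that idea; the reply payload and field names are made up for illustration and are not the exact Property Exchange schema:

```python
import json

# Hypothetical JSON a synth might return for a controller-list request.
# The field names ("title", "ctrlIndex", "ctrlType") are illustrative,
# not the exact Property Exchange schema.
reply = json.dumps([
    {"title": "Filter Cutoff", "ctrlIndex": [74], "ctrlType": "cc"},
    {"title": "Resonance",     "ctrlIndex": [71], "ctrlType": "cc"},
    {"title": "Env Attack",    "ctrlIndex": [73], "ctrlType": "cc"},
])

def destinations(reply_json):
    """Build a name -> CC-number map a sequencer could show in its LFO menu,
    instead of making the user look up a MIDI implementation chart."""
    return {e["title"]: e["ctrlIndex"][0] for e in json.loads(reply_json)}

print(destinations(reply))
```

The point is just that the names come from the device itself, so an Oxi One-style LFO menu could read “Filter Cutoff” rather than “CC 74”.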
Other than that, I struggle to come up with ideas.
In reality it took a lot more than 2 people to make MIDI 1.0 come to fruition, but the idea was so good that it lasted us all the way up until now with very few changes. I have a hard time picturing things which can improve upon the universal language itself.
Mostly my desire is that multiple boxes could be controlled in a way that only an instrument with full integration would. So like, everything from one UI, seamlessly, as if it was all already in one package, but I’m not sure how to accomplish that with better midi.
It would take something with an operating system, like Windows or Mac, and software from each individual maker that can talk to the others and be fully manipulated from a central interface. Unfortunately, to some degree that already exists in what we call a computer.
I don’t really like computers for music as much as some do, but I feel like even if you make dedicated super-midi hardware that can handle the task, then it’s basically just a specialty computer that is masquerading as hardware.
Haven’t seen anything like this in the spec and have no real idea how to implement but two thoughts:
How fast can midi oscillators go? Could you do some FM like stuff by modulating params at audio rate? I know it’s all just data but I wondered if 2.0 makes anything new possible like this that wasn’t before.
Secondly, I’d love it if MIDI clock could do things better across devices, like multiple clocks, rubato, reverse, skip, and fast-forward.
As someone who often runs out of MIDI channels because I’m a fool doing silly things with my Midihub etc., I’m also excited about having 256 channels.
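The 256 comes from the new Universal MIDI Packet addressing: 16 groups, each carrying 16 channels, on one connection. A rough sketch of how a MIDI 2.0 CC packs a group and channel into a 64-bit UMP message — the field layout here reflects my reading of the UMP spec, so treat it as an assumption:

```python
def ump_midi2_cc(group, channel, index, value):
    """Pack a MIDI 2.0 Control Change as a 64-bit UMP message.
    Word 0: message type 0x4 (MIDI 2.0 Channel Voice), group nibble,
    status 0xB (Control Change), channel nibble, controller index.
    Word 1: the full 32-bit controller value."""
    assert 0 <= group < 16 and 0 <= channel < 16 and 0 <= index < 128
    word0 = (0x4 << 28) | (group << 24) | (0xB << 20) | (channel << 16) | (index << 8)
    return (word0 << 32) | (value & 0xFFFFFFFF)

# 16 groups x 16 channels = 256 addressable channels on one connection
print(16 * 16)  # 256
```

Each group nibble effectively behaves like a whole extra 16-channel MIDI 1.0 port, which is where the 16x headroom comes from.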
The line rate is determined by the medium, so by default DIN and TRS won’t give you any more bandwidth. I believe, though I have not confirmed it in the spec, that negotiating faster line rates on those transports is possible, but you’re unlikely to see more than a few hundred kbps, meaning you might get up to 4-5 kHz just spamming a single CC and nothing else down the line, but not much faster. MIDI 2.0 over USB will use a real-time transport mode, as opposed to the bulk transfer mode that 1.0 uses, meaning sending audio-rate data should be possible, but I don’t know what the max frequency will be in practice. RTP-MIDI (over Ethernet) already supports a 10 kHz sample-aligned clock, but this doesn’t help without synths that can speak it directly, unless you’re using soft synths, and even there I’m guessing you’ll find compatibility issues.
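The back-of-envelope math behind those numbers, assuming roughly 10 bits on the wire per byte (serial start/stop framing) and an 8-byte, 64-bit UMP packet per MIDI 2.0 CC; the 300 kbps figure is just a hypothetical negotiated rate:

```python
def cc_messages_per_second(line_bps, message_bytes, wire_bits_per_byte=10):
    """How many back-to-back CC messages fit on a serial line per second,
    given a line rate and a message size (framing overhead included)."""
    return line_bps / (wire_bits_per_byte * message_bytes)

# Classic DIN MIDI 1.0: 31,250 baud, 3-byte CC message
print(round(cc_messages_per_second(31_250, 3)))    # ~1042 messages/s
# Hypothetical 300 kbps negotiated link, 8-byte MIDI 2.0 CC packet
print(round(cc_messages_per_second(300_000, 8)))   # 3750 messages/s, ~4 kHz
```

So even an optimistic serial upgrade tops out around the 4 kHz ballpark mentioned above, well short of audio rate.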
Just read this article. Really exciting stuff. I hope other manufacturers adopt MIDI 2.0.
I think it’s good that we have backwards compatibility, but I hope new devices from Elektron take advantage of the advancements that 2.0 promises.
Cheers
I don’t know what short term implications this has for users like us here but looks like good news anyway: MIDI 2.0 Profile for MPE.
Oh, and this:
MPE is a bridge.
Keep driving.