MIDI 2.0

Yes, potentially. I think there would be some raw processing speed needed, and the USB implementation may need to be at a certain level, but this could potentially be added to other existing controllers. It may be a high bar, though.

I noticed that, for instance, Native Instruments bumped up the processor in the Mk3 controller, no doubt in part due to these needs.

I am hopeful for synth / controller combinations like the Hydrasynths and the Osmose. I’d like to see the implementation on both sides. Glen has mentioned the issues with implementing MIDI 2.0, so they’ve at least looked at it. Remember too that they added MPE later in a firmware upgrade, though they may be running out of room for much more in firmware.

2 Likes

@Jukka thank you always for your thoughts and insights! Question:

Who is Glen? 🙂

Oh sorry.

Glen Darcey, CEO of ASM / Hydrasynth.

1 Like

Glenn who?

https://www.youtube.com/watch?v=aME0qvhZ37o&t=19

This is a very, very small part of what’s new in the latest Cubase 13 release, but it matters:

[screenshot of the Cubase 13 feature list, including “Ready for MIDI 2.0”]

The way they say “Ready for MIDI 2.0” strikes me as funny. Why not say something more active and definite, like “Support for MIDI 2.0”?

MIDI 2.0 support is something you’d have to expect from Yamaha Steinberg.

This is one more piece: the DAWs are ready, there are controllers, and a few more software synths. Hardware synths next.

1 Like

( Continued from this post. )

MIDI 2.0 can be very useful with controllers where the controls are used for multiple purposes and change dynamically.

But MIDI 2.0 is also useful in more static control setups, with Property Exchange ( PE ) helping when switching between them. And MIDI 2.0 is definitely worthwhile for its precision and depth alone.
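
To put a number on “precision and depth”: a MIDI 1.0 CC carries 7 bits, while the equivalent MIDI 2.0 message is a 64-bit Universal MIDI Packet with a 32-bit value. Here is a minimal C sketch of the two, with the bit layout based on my reading of the UMP spec, so worth double-checking against the official documents:

```c
#include <stdint.h>
#include <stdio.h>

/* MIDI 1.0: a Control Change is three bytes, with a 7-bit value (0-127). */
static void midi1_cc(uint8_t out[3], uint8_t channel, uint8_t cc, uint8_t value7)
{
    out[0] = 0xB0 | (channel & 0x0F);
    out[1] = cc & 0x7F;
    out[2] = value7 & 0x7F;
}

/* MIDI 2.0: the same Control Change is a 64-bit Universal MIDI Packet
 * (message type 0x4) carrying a full 32-bit value. Layout per the UMP
 * spec as I read it; verify against the official documents. */
static void midi2_cc(uint32_t ump[2], uint8_t group, uint8_t channel,
                     uint8_t cc, uint32_t value32)
{
    ump[0] = (0x4u << 28) |                 /* message type: MIDI 2.0 channel voice */
             ((group & 0xFu) << 24) |
             (0xBu << 20) |                 /* status: Control Change */
             ((channel & 0xFu) << 16) |
             ((cc & 0x7Fu) << 8);           /* controller index; low byte reserved */
    ump[1] = value32;                       /* 32-bit controller value */
}

int main(void)
{
    uint8_t  old_cc[3];
    uint32_t new_cc[2];

    midi1_cc(old_cc, 0, 74, 100);           /* filter cutoff, 128 steps */
    midi2_cc(new_cc, 0, 0, 74, 0xC8000000); /* roughly the same position, ~4.3 billion steps */

    printf("MIDI 1.0: %02X %02X %02X\n", old_cc[0], old_cc[1], old_cc[2]);
    printf("MIDI 2.0: %08X %08X\n", new_cc[0], new_cc[1]);
    return 0;
}
```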

Hardware and software synths where NRPN or MPE was needed are prime targets for a MIDI 2.0 interface. Properties exchanged via PE could be stored within a patch. For instance, think about making a synth’s macros available to a controller via MIDI 2.0.
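
As a concrete example of why NRPN users benefit: a 14-bit NRPN write needs four separate MIDI 1.0 CC messages, while MIDI 2.0 folds the same thing into a single 64-bit “Assignable Controller” packet with a 32-bit value. A rough sketch, again with bit positions per my reading of the UMP spec:

```c
#include <stdint.h>

/* MIDI 1.0: an NRPN write takes four separate CC messages (12 bytes)
 * and still only carries a 14-bit value. */
static int midi1_nrpn(uint8_t out[12], uint8_t ch,
                      uint8_t bank, uint8_t index, uint16_t value14)
{
    uint8_t status = 0xB0 | (ch & 0x0F);
    uint8_t msgs[12] = {
        status, 99, bank  & 0x7F,          /* CC 99: NRPN MSB ("bank")  */
        status, 98, index & 0x7F,          /* CC 98: NRPN LSB ("index") */
        status,  6, (value14 >> 7) & 0x7F, /* CC 6 : data entry MSB     */
        status, 38,  value14       & 0x7F, /* CC 38: data entry LSB     */
    };
    for (int i = 0; i < 12; i++) out[i] = msgs[i];
    return 12;
}

/* MIDI 2.0: one 64-bit Assignable Controller UMP does the same job
 * atomically, with a 32-bit value. (Bit layout per my reading of the
 * UMP spec; verify before relying on it.) */
static void midi2_assignable(uint32_t ump[2], uint8_t group, uint8_t ch,
                             uint8_t bank, uint8_t index, uint32_t value32)
{
    ump[0] = (0x4u << 28) | ((group & 0xFu) << 24) |
             (0x3u << 20) |                 /* status 0x3: assignable controller (NRPN role) */
             ((ch & 0xFu) << 16) |
             ((bank & 0x7Fu) << 8) | (index & 0x7Fu);
    ump[1] = value32;
}
```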

I am also thinking about how MIDI 2.0 could be applied in a mostly CV-controlled modular system. There are currently no MIDI 2.0-to-CV converters. ( Though people at Perfect Circuit have been considering this – search for “MIDI 2.0” at this link. ) It’s definitely something that would come at a premium, but i can see uses, flexibilities, and possible designs.
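
For the CV idea, the interesting part is really just the scaling: a hypothetical MIDI 2.0-to-CV module would map 32-bit controller values onto whatever DAC it has. A sketch, with the 16-bit DAC width and 0–10 V range purely as assumptions for illustration:

```c
#include <stdint.h>

/* Hypothetical sketch of the maths a MIDI 2.0-to-CV module would do:
 * map a 32-bit MIDI 2.0 controller value onto a DAC that spans 0-10 V.
 * The DAC width (16-bit) and voltage range are assumptions for the example. */

#define DAC_BITS   16
#define DAC_MAX    ((1u << DAC_BITS) - 1u)
#define CV_RANGE_V 10.0

/* 32-bit controller value -> DAC code (simple truncation keeps it monotonic). */
static uint16_t midi2_value_to_dac(uint32_t value32)
{
    return (uint16_t)(value32 >> (32 - DAC_BITS));
}

static double dac_to_volts(uint16_t code)
{
    return CV_RANGE_V * (double)code / (double)DAC_MAX;
}

/* For comparison: a classic 7-bit CC gives 128 CV steps of ~78 mV over 10 V,
 * while even this 16-bit DAC gives 65536 steps of ~0.15 mV. */
```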

1 Like

Hi Passenger9463. What do you think of the Native Instruments S-Series Mk3 keyboards?

Native Instruments has a reputation for playing less well with synths outside their world, but i think that good MIDI 2.0 on these controllers would change this.

You can buy controllers like that right now, and you can get them to output network midi, which should be easy to convert to DIN via something like a Bome Box — look at the Skaarhoj line. The hardware is beautiful, rock solid, and infinitely flexible, but it also costs the world. I’d expect to see a midi 2.0 profile mode from them at some point, and building your own firmware to run on their hardware is totally doable. OTOH, we’re talking OB-6 money for a four-knob, four-fader, sixteen-button controller.

1 Like

So i see that Bome has two applications for four OSs – macOS, Windows, Linux (x86-64), and Raspberry Pi OS.

They are called the Bome MIDI-CI Tools, for MIDI 2.0 and MIDI-CI developers: there is the Bome MIDI-CI Initiator and the Bome MIDI-CI Responder. Basic glue pieces for MIDI 2.0 implementations.

CI = Capability Inquiry
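
For anyone wondering what these tools actually exchange: every MIDI-CI transaction (discovery, profiles, property exchange) rides inside a Universal SysEx message with Sub-ID#1 0x0D and a common header. Here is a rough C sketch of that header based on my reading of the MIDI-CI spec, so verify the field order before relying on it:

```c
#include <stdint.h>
#include <stddef.h>

/* Minimal sketch of the common MIDI-CI message header, as I understand it
 * from the MIDI-CI spec (verify field order against the official document). */

/* 28-bit value as 4 x 7-bit bytes, LSB first (how MUIDs are carried). */
static size_t write_28bit(uint8_t *p, uint32_t v)
{
    for (int i = 0; i < 4; i++) p[i] = (v >> (7 * i)) & 0x7F;
    return 4;
}

static size_t midi_ci_header(uint8_t *buf, uint8_t sub_id2, uint8_t ci_version,
                             uint32_t source_muid, uint32_t dest_muid)
{
    size_t n = 0;
    buf[n++] = 0xF0;           /* SysEx start                      */
    buf[n++] = 0x7E;           /* universal non-real-time          */
    buf[n++] = 0x7F;           /* device ID: to/from the whole port */
    buf[n++] = 0x0D;           /* Sub-ID#1: MIDI-CI                */
    buf[n++] = sub_id2;        /* e.g. 0x70 = Discovery            */
    buf[n++] = ci_version;     /* CI message format version        */
    n += write_28bit(buf + n, source_muid);
    n += write_28bit(buf + n, dest_muid);  /* 0x0FFFFFFF = broadcast */
    /* ...message-specific payload follows, then the 0xF7 terminator... */
    return n;
}
```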

They are available for download from Bome:

Bome will be at NAMM 2024 ( thread ), and i will be expecting more from them at that time.

1 Like

I am thinking an announcement from Bitwig regarding MIDI 2.0 is very possible. Bitwig will also be at NAMM 2024, and their CLAP support lines up with MIDI 2.0 as well.

From this Bitwig website:

Better Modulation

The CLAP standard promotes new ways to create music with automation, modulation, and expressions. Here are a few examples:

  • CLAP supports per-note automation and modulation (in accordance with the recent MIDI 2.0 specifications).
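
That last bullet maps directly onto MIDI 2.0’s per-note controllers, where a modulation value targets one sounding note instead of a whole channel. A small sketch of what that looks like on the wire (bit layout per my reading of the UMP spec, so treat it as illustrative):

```c
#include <stdint.h>

/* Sketch: per-note automation/modulation on the wire in MIDI 2.0, as an
 * Assignable Per-Note Controller UMP, which targets one held note rather
 * than a whole channel. Verify the layout against the official documents. */
static void midi2_per_note_controller(uint32_t ump[2], uint8_t group, uint8_t ch,
                                      uint8_t note, uint8_t index, uint32_t value32)
{
    ump[0] = (0x4u << 28) | ((group & 0xFu) << 24) |
             (0x1u << 20) |                 /* status 0x1: assignable per-note controller */
             ((ch & 0xFu) << 16) |
             ((note & 0x7Fu) << 8) |        /* the specific note being modulated */
             index;                         /* which per-note parameter          */
    ump[1] = value32;                       /* 32-bit modulation value           */
}
```
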
3 Likes

Oooh, very happy to see this from them!

1 Like

This is a wildly niche question, but does anyone know if there’s been any crossover between the SMPTE ST 2110-41 & -42 Extensible Fast Metadata Transport & Formatting standards efforts and MIDI 2.0? Particularly when it comes to interaction with AES67/AES X242?

It would be really nice to see midi over RTP gain more penetration, and with all the other changes they’re making it would make natural sense for midi to continue to be the default control-surface communication and generic automation protocol as larger-scale productions move to converged IP platforms for uncompressed audio and video, metadata, lighting, and control flows.

Personally, I’d love to be able to manage it in the same way that I handle Dante audio in the house — not least because actually getting full Dante audio routing everywhere also means I need a distributed control surface fleet. In the meantime, I’ll probably try the little rtp bridge from Doremidi, but it’s pretty basic: one sender can talk to three listeners, one listener can merge (bidirectional) traffic from/to five listeners, and config is all pushing buttons to pair stuff. Having it in the same management tool as Dante would be great.

1 Like

I expect Bitwig will be one of the first DAWs to deeply implement MIDI 2.0, based on their history with MPE and how forward-thinking they appear to be. Is there any information on whether MIDI 2.0 allows tighter sync with external sequencers? I think I read it may, but I am not sure what that would take to implement in hardware. Tight sync is something I know a ton of us would love to have without resorting to 3rd-party boxes, etc.

That depends on the OS vendors and the folks who maintain the USB spec agreeing with the midi folks to change the transfer mode to something real-time, like audio. Doing so would break backwards compatibility with 1.0 entirely, so I suspect it will only happen, if at all, via a negotiation protocol, which adds complexity to the driver. This is one of the reasons I’d like to see Ethernet as a transport, and ideally something that can lean on Dante. Dante is sample-accurate with 1 ms latency port to port, and handles clocking transparently. It’d make getting midi in and out of computers accurately much easier, especially if you’re also using it for audio.
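
For completeness, the sync mechanism MIDI 2.0 itself adds is the jitter-reduction (JR) timestamp, a small utility packet a sender can prefix to its messages; as noted above, how much it helps in practice still depends on the transport underneath. A hedged C sketch, with the 31,250 Hz tick rate being my recollection of the UMP spec:

```c
#include <stdint.h>

/* Sketch of a MIDI 2.0 jitter-reduction (JR) timestamp: a 32-bit utility UMP
 * carrying a 16-bit timestamp. My understanding is that the clock ticks at
 * 31,250 Hz (so the 16-bit field wraps roughly every 2 seconds); treat the
 * exact figures as something to verify against the UMP spec. */

#define JR_CLOCK_HZ 31250u

static uint32_t jr_timestamp_ump(uint64_t sample_position, uint32_t sample_rate)
{
    /* Convert an audio-clock position to JR ticks, keep the low 16 bits. */
    uint16_t ticks = (uint16_t)((sample_position * JR_CLOCK_HZ) / sample_rate);

    return (0x0u << 28) |   /* message type 0x0: utility */
           (0x2u << 20) |   /* status 0x2: JR timestamp  */
           ticks;           /* 16-bit timestamp value    */
}
```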

2 Likes

Dante is awesome for sure. Audinate is a pretty great company.

If you wanted to rely on Dante for this, equipment would see an immediate price increase, as software and hardware solutions making use of Dante need to be licensed. Would be cool to have as an option, though.
Maybe rtpMIDI will evolve to support MIDI 2.0…

Well, I’d prefer AES67, to be precise.

Would really be the dream, yeah.

Ravenna people, if you’re reading this: pretty please?

That’s the reason for the specific question I asked — AES X242 is/was the standards effort to add parallel metadata streams to AES67, which I think would be roughly the right mechanism — a secondary RTP stream in the same connection. Add “midi” as a metadata stream type (they already support json and xml, inherited from the AES3 metadata work, so bare midi, possibly with some minimal framing headers, would be fine), then allow metadata-only streams with no underlying audio connection, and you’re done.
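
To make that proposal concrete, here is a purely hypothetical sketch of what such a stream could carry: a standard RTP header (RFC 3550) with a dynamic payload type, and bare MIDI bytes as the metadata payload. None of this is a published standard; it is just the idea above written down in C:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical illustration of the proposal above: raw MIDI bytes carried as a
 * secondary metadata stream in RTP, alongside (or without) the AES67 audio.
 * The RTP header below is standard RFC 3550; the payload type value and the
 * idea of a "midi" metadata stream type are the poster's proposal, not any
 * published standard. */

typedef struct {
    uint8_t  vpxcc;        /* version=2, padding, extension, CSRC count */
    uint8_t  m_pt;         /* marker bit + payload type                 */
    uint16_t sequence;     /* per-stream sequence number                */
    uint32_t timestamp;    /* shared media clock (e.g. the AES67 clock) */
    uint32_t ssrc;         /* stream identifier                         */
} rtp_header;              /* 12 bytes; network byte order in practice  */

static size_t pack_midi_metadata_packet(uint8_t *out, uint16_t seq,
                                        uint32_t ts, uint32_t ssrc,
                                        const uint8_t *midi, size_t midi_len)
{
    rtp_header h = {
        .vpxcc     = 0x80,        /* version 2, no padding/extension/CSRCs */
        .m_pt      = 97,          /* dynamic payload type (assumed value)  */
        .sequence  = seq,
        .timestamp = ts,
        .ssrc      = ssrc,
    };
    memcpy(out, &h, sizeof h);               /* (endianness ignored for brevity) */
    memcpy(out + sizeof h, midi, midi_len);  /* bare MIDI bytes as the payload   */
    return sizeof h + midi_len;
}
```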

I’m curious if they added metadata to MADI in the same way that they did to AES3, and, if so, how that relates to the MADI midi transport — if they were smart and reused the same mechanisms, we might get it for free-ish? We’d just need to add the “zero audio channel connection” concept. I’d read the specs, but I’m not $50-a-doc non-AES-member curious.