Overbridge latency in Ableton Live

Anyone try the latest betas? Do they still add 30–50 ms of latency to your sets?


No.

I assume you mean "no, they do not add the latency."

Are you able to tell how much latency the OB plugin uses inside of Ableton?

To do that, simply hover your mouse over the Ableton device title bar for the OB plugin, and Ableton will show in the info bar at the bottom the amount of latency the plugin is reporting and adding.

Many thanks.


About 27ms in Studio One 4.

This is to be expected. DAWs automatically compensate for this plugin latency.


Delay compensation has nothing to do with this. I explained this at length above.

It is maddening to me that Elektron apparently shares the mentality expressed in your comment. No offense.

People who play their instruments lose their minds at anything above about 10 ms of latency. 20, 30, or 50 ms is a complete non-starter.

No one who plays instruments is going to use Overbridge with latencies this high. No one.

But, thank you for the report.


I'm an instrumentalist myself, so I understand you perfectly well, in fact :slight_smile:

I merely answered your question about what the latency is.

I'm afraid that this is just how things work, and it's not about not caring. There are many plugins that add latency, like Ozone 8 or Neutron and many more. Latency just happens with certain kinds of processing, especially round-trip setups where audio and MIDI have to go back and forth over USB.
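As a rough back-of-the-envelope sketch of why a round trip over USB adds up (the buffer size and USB overhead below are illustrative assumptions, not measured Overbridge figures):

```python
# Rough round-trip latency estimate for a USB audio path.
# All numbers here are assumptions for illustration, not measured values.

def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """Latency contributed by one audio buffer, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000

sr = 44_100          # sample rate in Hz
block = 256          # assumed DAW buffer size in samples
usb_overhead_ms = 5  # assumed per-direction USB transfer/driver overhead

one_way = buffer_latency_ms(block, sr) + usb_overhead_ms
round_trip = 2 * one_way  # audio has to go out to the box and come back

print(f"one-way    ~ {one_way:.1f} ms")
print(f"round trip ~ {round_trip:.1f} ms")
```

Even with these modest assumptions the round trip lands in the 20 ms range, which is roughly where the reported Overbridge figures sit.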

I'm sure this can be solved in the future, just as audio interfaces have become much better over time.


Cool. I appreciate your thoughts and you taking the time.

To me, one good way to address it might be to allow the plug-in to control and sync the machines with a lower latency. In that case, the audio would be monitored directly.

But, admittedly, I am armchair engineering now. I recognize that this is all more complicated in real life.

But, part of me believes that if the engineering ambitions were lowered, a compromise could be made as I described above.


Even if it were possible by going that route, it would mean sacrificing other options that many users want. You can't make everybody happy :wink:

I've recorded many guitar, bass guitar, and percussion parts for songs I used to write, with old, crappy audio interfaces and old DAWs that didn't have automatic delay compensation after recording. I changed my workflow to accommodate that: record as much of the instrument parts as possible first, then go into the processing stage, and then the mixing stage. If I had to record a part over again, I'd disable all FX and software instruments, then turn them back on after recording.

IMHO it's better to accept how things work and adapt the workflow to these limitations.


All good thoughts here, but it's still just impossible to use it as a plugin on stage. So sad.


A solution would be to use an audio interface with 4 outputs: route everything you want to run through the Heat out via outputs 3 & 4, and back into the audio interface.


But, you would still have the latency. You would feel it when you were playing. I don't see how that is a solution.

Apologies if my tone sounds rude. Not trying to be!


I think @DaveMech is suggesting routing the audio into and out of the Heat using your normal soundcard. That way, the only latency you'd get is the latency from the soundcard itself (avoiding the Heat/OB/USB audio path).

You'd get the management and control features of OB, but you'd ignore the audio part.

@DaveMech will tell me if I'm telling lies about what he means :slight_smile:

(I just tried this with my DT, and it seems to work just fine. I can control the DT using OB, and route the audio out via the main outs.)

If you add the OB plugin to your set and it's enabled, then you incur the latency hit.

I used to do a similar thing with my RYTM. I'd use the OB plugin for programming and then switch it off when I was done to avoid the latency.


Ah, OK. I'm not experienced enough with using VSTs to comment further on that.

I know it's not what you want, but in case it is helpful, there are standalone apps for the DT and the AH (so far) which allow you to control the device using the app.

A bit of a n00b question, but (in the case where we don't want audio from the VST) what is the problem with this? Is it that the DAW compensates and delays the rest of the tracks? If so, isn't that something you can manually force to 0?

The fact that DAWs compensate plugin latency has nothing to do with the fact that you still have latency when you play software instruments.

This has been the case since PDC was invented.

The reason a DAW can compensate latency is that it knows when the recorded notes are being played. It obviously cannot do that when you are playing live, because it cannot read your mind.

The only question is whether that latency bothers you.

30ms is a non-starter for most musicians.


I think the reason many Elektron users are not bothered by this is that they are more programmers than "players" in the classic sense.

Both are equally valid artists in my opinion. But, the fact is, programmer artists don't have any issue with latency.

Right. In the context of the Heat, which I was thinking about at first, it isn't going to be receiving many "notes", so it's no big deal: there aren't many audio channels, so you can just route them through an audio interface.

But with a DT, or an A4, a big point of using the VST is to get the many channels of audio via the USB interface. So there's no escaping the relevance of the VST latency. Unless you're not playing notes live, of course.

(I haven't measured, but perhaps there's less latency using the audio drivers instead of the VST.)