When I use the Grid Machine on the Digitakt II, I notice that the slices are slightly off. It’s particularly noticeable on drum loops.
I prepare them in Logic X, so that the individual hits are evenly distributed.
Then I convert them to 48 kHz by pitching them up in ffmpeg.
In the Digitakt, when I play a single drum hit, I always hear a slight blip at the end where the next hit bleeds in.
Has anybody else noticed this?
I’m trying to identify whether the issue is with the Digitakt, Logic or ffmpeg.
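In case it helps to narrow it down, here’s a crude check I can run on the loop before it ever touches the Digitakt: measure the gaps between hit onsets. A rough sketch, assuming a 16-bit WAV; the filename and threshold are just placeholders:

```python
import wave
import numpy as np

# Crude onset-spacing check on the exported loop (assumes a 16-bit WAV;
# the filename and the 50% threshold are placeholders).
with wave.open("loop_44k1.wav", "rb") as w:
    sr = w.getframerate()
    raw = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    if w.getnchannels() == 2:
        raw = raw[::2]                      # keep the left channel only

env = np.abs(raw.astype(np.float64))
above = env > 0.5 * env.max()
edges = np.nonzero(above[1:] & ~above[:-1])[0]   # rising edges over threshold

onsets, last = [], -sr
for i in edges:
    if i - last > 0.05 * sr:                # ignore re-triggers within 50 ms
        onsets.append(int(i))
        last = i

print(np.diff(onsets) / sr)                 # gaps between hits, in seconds
```

If the gaps are even straight out of Logic but not after the conversion, that points at the ffmpeg step; if they’re already uneven before it, the Digitakt is off the hook.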
Are the drum loops originally from the Digitakt, or do you actually create them in Logic? Are you using the Digitakt metronome as a point of reference? Even set to the same BPM, you probably can’t expect two independent devices to be in mathematically perfect sync, so if you’re getting something that seems slightly off grid, either it really is off grid, or perhaps the transients on some hits are longer than the slice segment.
This is just my first thought; it’s definitely possible something else in the conversion is causing it to not cut at a zero crossing. It’s also my understanding that stereo samples make it harder to achieve a zero-crossing slice.
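To illustrate the stereo point: the two channels almost never cross zero on the same sample, so wherever you cut, at least one channel is likely to get a small step. A toy sketch with made-up frequencies, nothing to do with the actual loop:

```python
import numpy as np

# Two toy channels with different content: count where each crosses zero
# and how often those crossings land on the same sample index.
sr = 48000
t = np.arange(sr // 100) / sr                     # 10 ms of audio
left = np.sin(2 * np.pi * 220 * t)
right = np.sin(2 * np.pi * 330 * t + 0.7)         # different pitch and phase

def zero_crossings(ch):
    return set(np.nonzero(np.signbit(ch[:-1]) != np.signbit(ch[1:]))[0])

zl, zr = zero_crossings(left), zero_crossings(right)
print(len(zl), len(zr), len(zl & zr))             # shared crossings: usually 0
```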
Don’t you think synchronicity shouldn’t matter, though? As I understand it, the Grid Machine slices samples into even pieces regardless of tempo, so if the hits are evenly distributed it should be fine, no?
Also, doesn’t the DT automatically convert files on import anyway? I’d try removing this step of the procedure, as it could be unnecessary and may even be adding something unwanted to the end result (i.e. the file not matching up perfectly to the grid).
I do it to save space. I’m assuming both Logic and Transfer would upsample, while ffmpeg offers the possibility to just pitch the sample up to the specified rate without adding any samples, which saves about 10% of memory.
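To be clear about what I mean by pitching up: no samples are added or removed, the audio data stays byte-for-byte identical and only the declared rate changes. This isn’t my actual ffmpeg invocation, but conceptually it’s equivalent to this (filenames are placeholders):

```python
import wave

# Conceptual equivalent of the "pitch up" route (not my real ffmpeg command):
# copy the frames untouched and only re-label the sample rate as 48 kHz.
with wave.open("loop_44k1.wav", "rb") as src:
    params = src.getparams()
    frames = src.readframes(params.nframes)

with wave.open("loop_48k.wav", "wb") as dst:
    dst.setparams(params._replace(framerate=48000))  # same samples, new rate
    dst.writeframes(frames)
```

A proper resample would instead have to generate 48000/44100 ≈ 1.088 times as many samples, which is where the roughly 10% comes from.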
Have you tried doing it the upsampling way to see if your issue with the grid not lining up correctly still occurs?
I’m quite sure that just telling a file it is now 48 kHz (and thereby pitching it up) can cause issues that might not be obvious when just listening to it.
I use quite a lot of breaks that I split up. As long as they are originally on the grid (with no swing) I don’t have issues with that (at least I haven’t noticed any).
I’d be very surprised if this was not the problem.
The resample from 44.1 (or anything less than 48) up to 48 means the algorithm has to work out some individual samples from adjacent ones, so it wouldn’t be surprising if the last sample (or two?) before a transient were influenced by the following transient.
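To make that concrete, here’s a toy sketch of what interpolation does around a sharp hit. I’m using naive linear interpolation as a stand-in; ffmpeg’s actual resampler is sinc-based and spreads the influence over even more neighbours, but the principle is the same:

```python
import numpy as np

# Toy 44.1 kHz signal: silence, then one sharp "transient" at sample 10.
sr_in, sr_out = 44100, 48000
x = np.zeros(20)
x[10] = 1.0

# Crude linear-interpolation resample to 48 kHz (stand-in for a real
# sinc-based resampler, which smears even further).
t_in = np.arange(len(x)) / sr_in
n_out = int(round(len(x) * sr_out / sr_in))
t_out = np.arange(n_out) / sr_out
y = np.interp(t_out, t_in, x)

print(x[8:12])   # [0. 0. 1. 0.]  -> dead silence right up to the hit
print(y[8:12])   # the sample just before the hit is no longer zero
```

If a slice boundary falls right there, that smeared energy ahead of the transient sits at the tail of the previous slice, which could well be what you’re hearing as the blip.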
EDIT: Audacity is very good at showing what’s happening at the individual sample level. Maybe take a look at your before and after WAVs in Audacity.
That is definitely worth a try, although I believe that under the hood, upsampling in Transfer introduces more irregularities than simply pitching up, since the latter keeps the actual audio data exactly the same.
Yes, because you are right that upsampling looks at adjacent samples.
No, because this is what I believe Transfer does, and I avoid exactly that by using ffmpeg.
Yes, but in practice what actually happens? I’m with @bibenu in that I’d be surprised if your issue does not stem from the process you are currently using (i.e. the conversion in ffmpeg).
Are you positive that the ffmpeg algorithm (which is converting from 44.1 to 48, I presume? or did I misunderstand?) cannot possibly be to blame here? I wouldn’t be sure unless I actually looked at the before and after WAVs (at the level of individual samples) in something like Audacity.
I have to batch convert files all the time as part of my job, and (slight) inconsistencies in file lengths following conversion are extremely common. So I’d be willing to put money on this being the source of the issue.
Also confused about exactly what @knods3k’s ffmpeg process does. I initially thought it must mean an algorithm that adds interpolated samples, which was later confirmed by this:
So a small file (at 44.1) gets about 10% bigger (at 48), which means samples must have been added by interpolation, very likely with the effect I’ve been suggesting.
My suggestion could be completely wrong if ffmpeg is just changing the header so the same set of samples play back at a higher rate (and higher pitch) but as far as I can tell, that’s not what’s happening here.
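Either way it should be easy to settle without squinting at waveforms: just compare the frame counts. If ffmpeg only rewrote the header, the count is identical and only the rate differs; if it actually resampled, the 48 kHz file has roughly 8.8% more frames (48000/44100 ≈ 1.088). Something like this, with placeholder filenames:

```python
import wave

# Compare the before/after files (placeholder names): header-only "pitch up"
# keeps the frame count identical, a real resample adds ~8.8% more frames.
for name in ("before_44k1.wav", "after_48k.wav"):
    with wave.open(name, "rb") as w:
        rate, frames = w.getframerate(), w.getnframes()
        print(f"{name}: {rate} Hz, {frames} frames, {frames / rate:.6f} s")
```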
Concur. That avoids the possibility of either ffmpeg or Transfer messing things up by interpolating in a way that will introduce artefacts when slicing near transients.
Is the grid derived from the sample length, or does it run parallel to the project BPM? Meaning, does it lay the sample and the grid down at the same starting point and then chop at the DT’s BPM grid?
It seems like in the first case, if the project tempo only allowed slightly shorter segments, i.e. if the length of a 16th note were shorter than Logic’s 16th note, then a slice would be truncated so that the next slice can play. I can see that causing a blip.
If it slices on a grid based on project BPM, then at whatever point the Logic-created grid drifts from the DT slice grid, it could create a similar truncation.
It’s been my experience with slices that they need to fall very close to the DT grid to get desirable results.
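To put rough numbers on the drift scenario, here’s a toy sketch with a made-up tempo and loop length (purely my guess at the two schemes, not how the Grid Machine actually works internally):

```python
# Made-up numbers to compare the two slicing schemes being discussed.
sr = 48000
loop_len = 2.0 * sr                 # a 1-bar loop exported at 120 BPM
slices = 16

# Case A: sixteen equal divisions of the sample length.
len_a = loop_len / slices           # 6000.0 samples per slice

# Case B: a 16th-note grid at the Digitakt's project tempo.
bpm = 119.5                         # project tempo just slightly off 120
len_b = sr * 60 / bpm / 4           # samples per 16th note at that tempo

drift = (len_b - len_a) * slices    # cumulative offset over the bar
print(round(len_b - len_a, 1), round(drift, 1), round(drift / sr * 1000, 1))
```

Even half a BPM of mismatch adds up to a few hundred samples by the end of the bar, which is plenty to push a slice boundary into the previous hit’s tail or cut the current one short, i.e. a blip.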