Does anyone have a link to a single compressed file of the Rhythm Lab Breakbeats from here?
https://rhythm-lab.com/breakbeats
Downloading each file individually is killing me
…i'd say, calm ur collectors/must-have-it-all horses…
that's pretty counterproductive…
curate ur loops…each time u start something new, u pick/choose one or two carefully curated ones, dedicated to the moment/state of mind ur in, to get it started, instead of standing each time in front of an endless array of options…
I would just use something like the DownThemAll extension.
Exactly. Make it a video game achievement about mastering each break. You can't go to the next stage in Super Mario World until you've finished the current one.
Some of these breaks take a long time to learn. Once you learn everything about one, you can make any drum kit shine.
I have the entire Paul Nice libraries and I strategically use only a few breaks between two volumes. That's how powerful being selective can be.
This is great! Thank you!
Sounds like OP's needs have been met using the browser extension linked above, but for those who like "code golf" and want to avoid installing extensions, a command line approach is:
wget https://rhythm-lab.com/breakbeats -O - | \
egrep -o 'https://.+/sstorage/[^"]+\.wav' | \
head -n 5 | \
wget -i -
This will download the first five files. Get rid of the line with head -n 5
to download 'em all, though be careful as there's more than 900 of 'em.
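And if you do pull down the whole set, it might be worth being a little gentle on their server. Something like this (a sketch using standard wget flags; the breaks/ output directory is just my choice) adds a one second pause between downloads:
wget https://rhythm-lab.com/breakbeats -O - | \
egrep -o 'https://.+/sstorage/[^"]+\.wav' | \
wget --wait=1 -P breaks -i -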
You're right. I've been doing this since 1993 and I still can't help klepto-sample-mania.
Does this work in MacOS Terminal?
MacOS doesn't come with wget by default, so if you don't have wget on your system (say, via Homebrew) then we can accomplish the same thing in a more verbose manner using curl, which is a default MacOS application. This solution is more complicated; furthermore, curl is picky about what it considers valid URLs, so we need to do a little Python magic to encode the URL strings.
Unlike the wget pipeline, this is probably best done in multiple steps:
curl -s https://rhythm-lab.com/breakbeats | \
egrep -o 'https://.+/sstorage/[^"]+\.wav' | \
head -n 5 | \
python3 -c "import sys, urllib.parse as ul; [sys.stdout.write(ul.quote(l, safe=':/\n')) for l in sys.stdin]" \
> encoded_breakbeats.txt
Then feed the encoded URLs to xargs and curl to download each file:
xargs -n 1 curl -O < encoded_breakbeats.txt
As before, remove the head -n 5 line to get every wav file.
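If the downloads feel slow, xargs can also run a few curl processes in parallel (-P is supported by both the BSD xargs that ships with MacOS and GNU xargs); just be mindful of how hard you hit the server. A sketch:
xargs -n 1 -P 4 curl -O < encoded_breakbeats.txt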
Note that the file names of the downloaded files via this approach are "URL encoded" (e.g. spaces are replaced by "%20"), which is kinda ugly. A little more script fu could be used to change the file names post hoc, though I leave that as an exercise for the reader. The
wget
solution is definitely simpler.
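For what it's worth, here's one rough sketch of that post hoc rename, assuming you run it inside the download directory and that none of the decoded names collide (the loop is just illustrative, not part of the pipeline above):
# URL-decode each downloaded file name and rename the file accordingly
for f in *.wav; do
  decoded=$(python3 -c "import sys, urllib.parse as ul; print(ul.unquote(sys.argv[1]))" "$f")
  # only rename when decoding actually changed something; -n avoids clobbering
  [ "$f" != "$decoded" ] && mv -n -- "$f" "$decoded"
done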