Digging menu on a small screen is a workflow just like using a computer

Most so-called hardware synths, grooveboxes, sequencers, etc., even some “analog” gear, have digging menu on a small screen as part of their essential workflow. That is no different from using a computer.

There are a few product lines that don’t follow this old-fashioned, computer-like workflow.

One is the Novation Circuit. There is NO menu digging, and there is no screen. Some argue the 8x4 grid is an ultra-low-resolution screen. Indeed, the grid is part of the Circuit’s interface, since it both displays output and takes input.

Another is the Sonicware Liven series. There is only a 4-digit LCD for parameter values and pattern numbers, so there is NO menu digging.

The Novation Launchpad is like a 64-pixel square iPad with buttons. The grid can be used in clips mode, mixer mode, note mode, chord mode, or anything else with devices/apps that support MIDI learn. It really is like an iPad.
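To make the “grid as instrument” idea concrete, here is a minimal sketch of how a note mode can map pads to MIDI notes. The layout below (rows stacked in fourths, chromatic along a row) is the common isomorphic arrangement seen on Push/Launchpad-style controllers; the root note and offsets are illustrative assumptions, not any vendor’s actual firmware logic:

```python
# Sketch: mapping an 8x8 grid to MIDI note numbers in a "note mode".
# Assumption: each row is offset by a perfect fourth (5 semitones),
# a common isomorphic layout on grid controllers.
ROOT = 36        # C2, an arbitrary starting note
ROW_OFFSET = 5   # semitones between rows

def pad_to_note(row, col, root=ROOT, row_offset=ROW_OFFSET):
    """Return the MIDI note for a pad at (row, col), row 0 at the bottom-left."""
    return root + row * row_offset + col

# Moving right is chromatic, moving up jumps a fourth:
print(pad_to_note(0, 0))  # 36 (root)
print(pad_to_note(0, 3))  # 39
print(pad_to_note(1, 0))  # 41
```

The nice property of this layout is that any chord or scale shape is the same pattern of pads anywhere on the grid, which is exactly the kind of muscle-memory-friendly interface the thread is about.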

It is pointless to judge whether a particular piece of hardware is a computer or not. We don’t use computers; we don’t even use software. We actually use interfaces, and the interface dictates the workflow.

The iPad’s screen is an interface, just like the buttons on a Launchpad and all the knobs and faders on any device.

Here is an iPad in one kind of note mode as an instrument/controller: a 9x12 grid interface plus a few knobs and buttons. Just like a Circuit.

Here is an iPad with a Launchpad. The user never touched the iPad during the whole performance, and in fact he didn’t even have to LOOK AT the iPad. Is the user “using” the iPad or the Launchpad? Is he using a computer or a dedicated piece of “hardware”? If the user performs the same tune on the iPad screen, does it change anything?

Any hardware gear with a digging-menu-on-a-small-screen workflow is like a fully automatic “film” camera that also needs batteries (for the tiny chips inside). That’s not a mechanical camera. As I mentioned, there are very few product lines that offer a pure hands-on workflow, like using a mechanical camera, without digging menu.

An iPad is definitely a computer, but many apps offer ZERO menu digging and pure hands-on operation on the touchscreen, the interface. An iPhone could be a toy camera or a computer. What a piece of gear can do is up to the user, not the other way around.

I’ll leave the answer to the members here. Thanks for your time.


2 Likes

It’s okay to have two thoughts on something at the same time!

Computers/iPads are super powerful for making music. At the same time, some people don’t like using them for that purpose.

It’s going to be okay either way!

8 Likes

Never heard the term ‘digging menu’ before.

3 Likes

Menu dive yes, digging menu no!

4 Likes

Yes, why not both?

I only use an iPad and a sustain pedal on the go, and I own a Launchpad X and a Midimax. For locations/situations where I’m not comfortable pulling out my iPad, I use the Sonicware Liven 8bit Warps.

Have at it dude! Do what you like! If anyone tells you differently, maybe stop listening to them!

7 Likes

Haha, because usually there are layers of menus. Yeah, menu diving is the more standard way to say it.

2 Likes

The difference is that it’s not the same type of interface that many of us are chained to for 40 hours a week.

2 Likes

Menu digging is just one aspect of what can make music hardware feel less immediate and tactile, aspects that I think are the gist of this topic, or at least that’s what I’m mostly interested in thinking about.

Ambiguous modes and hidden or ‘non-correlated’ controls are another big issue with a lot of hardware. With modes, unless the current mode is made super obvious by clear changes in LED or screen patterns, it adds to the user’s workload: you have to keep track of what mode you’re in at any given time, and if there are several ‘modules’ all in different modes it can get complicated fast.

Hidden or nested shift functions can be a big pain, as we all know, but they’re really necessary from a practical point of view if we don’t want hardware that’s massive and covered with hundreds of buttons for dedicated functions in every context. Shift functions can be done pretty well if they maintain a consistent logic across different modes and contexts; I actually think the OT is not bad in this regard with things like copy/clear/paste etc.

What I mean by “non-correlated controls” is a physical interface that doesn’t resemble the underlying structure. A lot of synths do this, with the osc/filter/amp/lfo sections laid out in blocks that don’t flow into each other. The Kobol Expander and Behringer’s remake are good examples of physical controls flowing in the same manner as the signal path; the clear labelling really helps.

And things like the dedicated buttons on this module-select panel on the Hydrasynth are infinitely better than scrolling through list items and submenus:

[image: Hydrasynth module-select buttons]

With a really well designed interface it should be possible to grasp at least the broader picture of the underlying structure just by looking at it. However, this gets trickier to stay true to as more complexity is added. In general I much prefer potentiometer knobs to endless encoders, because I can see at a glance what the current state of a machine is; obviously sometimes you need to use encoders, otherwise you might end up with an Octatrack the size of a dinner table.

I don’t mind a little bit of deeper menu diving and small screen staring as long as it’s relegated to pre-configuration tasks that I don’t have to do too often. For anything that I might want to do to perform music I really want as much immediate hands on control over it as possible.

While big touchscreen interfaces like on an iPad offer a lot of advantages I just can’t stomach using them for anything really important. The lack of any meaningful haptic and tactile feedback kills muscle memory.

3 Likes

Fully agree with your take, @GurtTractor. A well-designed interface is extremely important. The difference between a fretboard and a keyboard is arguably more impactful than the difference between what a guitar and a piano sound like. It affects how the musician thinks more than anything else.

Speaking of haptic and tactile feedback: as a guitar/bass player, I mostly feel feedback from the vibration of my classical guitar. So as long as the speaker is on, or the iPad is placed right above the speaker, I do feel sort of the same feedback from the speaker’s vibration. When you touch the screen, it is vibrating :smile:

1 Like

Yeah I’m really interested in seeing more haptic feedback technology in devices.

No.

7 Likes

So, are you saying the “computer-like workflow” is the old fashioned one?

Personally I don’t find this general distinction important at all. I prefer working on hardware, even if I have to page through some menus. For me it’s not at all like working on a computer, but if someone says it is, that’s cool too.

1 Like

No, because menu diving on a hardware unit hardly benefits from a larger screen (bad eyesight out of the equation), but menu diving on a computer would be a pain shown on a small screen.

1 Like

Who cares what it is, so long as it’s used to make sick tracks.

1 Like

I understand it’s all about feeling for you, and I’m glad you know what you like.

I’m here to lay out a logic: when someone is menu diving/digging, they are NOT interacting with the music.

When anyone hits a note, turns a filter knob, or even programs a step sequence, they are interacting with their music, regardless of the interface (a button, a touchscreen, or whatever design/material).

I bet quite a few members here care about, or even swear by, “hardware” even when those devices require endless menu diving/digging that stops the flow of the music.

Menu diving is always there. Don’t get me wrong.

But menu diving while preparing a jam/performance/production is totally different from menu diving during the jam/performance/production.

I definitely prefer picking the program/pitch on a HUGE screen with one touch instead of endless page up/down on a tiny screen like on most hardware devices.

I was just kidding while thinking seriously about the topic. I think in the end it depends on feel, and on the connection, and thus the interaction and focus, between man and machine, the perfect partner. And that depends on the person.

Working with a hardware screen is nothing like working on a computer.

For me, having a small dedicated screen with a consistent UI and memorable tactile motions to do whatever needs to be done is faster and more productive than having one big monitor with inconsistent sizes and positions on screen that requires both mouse interaction and having your eyes hunt for a specific thing.
Some things have keyboard shortcuts and can be done without the mouse and eye strain, but most things don’t. It’s way faster for me to memorize menus, like in old-school games where you had key combos that you memorized very quickly and applied by reflex. You can’t do that when your desktop UI is changing constantly: you have all the devices spread across the monitor and you need to look for that specific thing to touch or click.

Well then, why not use something like TouchOSC and map everything to a single page of iPad controls? Or have a custom MIDI controller, like the Intech ones that are fully customizable to order? Easy solution.
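For what it’s worth, the “map everything yourself” approach boils down to a user-defined table from MIDI CC numbers to named parameters. Here is a toy sketch of that idea; the CC numbers, parameter names, and scaling functions are all hypothetical, and a real setup would use TouchOSC’s editor or a MIDI library rather than this dispatcher:

```python
# Toy sketch of a user-defined MIDI CC map, as you might build with
# TouchOSC or a customizable controller. CC assignments are made up.
CC_MAP = {
    74: ("filter_cutoff", lambda v: v / 127.0),               # 0.0 .. 1.0
    71: ("filter_resonance", lambda v: v / 127.0),            # 0.0 .. 1.0
    7:  ("volume_db", lambda v: -60.0 + (v / 127.0) * 60.0),  # -60 .. 0 dB
}

def handle_cc(cc_number, value):
    """Translate an incoming CC message (0-127) into a named parameter change."""
    if cc_number not in CC_MAP:
        return None  # unmapped controls are ignored
    name, scale = CC_MAP[cc_number]
    return name, scale(value)

print(handle_cc(74, 127))  # ('filter_cutoff', 1.0)
print(handle_cc(7, 0))     # ('volume_db', -60.0)
```

The point of a single flat map like this is exactly the “no menu digging” property discussed above: every control is one gesture away, and the mapping never changes under you.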

I tried working with an iPad but can’t stand touchscreens; there’s no way on earth I can control or achieve something faster with touch than with dedicated key combos that I remember by instinct and can operate blind.

Obviously there are bad hardware UI examples too: things that require menu diving with a single rotary knob, which makes the process awkward because you can’t rely on the reflex of turning the knob and knowing where you’re going to land, like on the Roland TRs. But on Elektron devices most things are pure muscle memory, which makes the workflow way faster and more memorable even if that specific thing you need requires a couple of extra key presses.