Post by wedoh on Jan 3, 2018 11:59:06 GMT -8
youtu.be/ee1wuZxiLsc
I think touch screens, combined with physical tools like pens and rotary encoders (like the ones Microsoft uses), are the future of controlling DAWs. My guess is that a DAW tailored for touch will be more efficient than one driven by mouse and keyboard. One reason is that initiating an action no longer takes a two-step approach, where you have to move your hand and match that movement against something you see moving on the screen.
For example:
When a stimulus based on what you hear tells you "I want to control the LF band of my EQ", there are several steps involved in reaching that parameter.
On an analogue console, you remember where the pot is located, reach for it and physically rotate it. You feel the knob, and from previous experience you know you're actually making some kind of change.
With a mouse and keyboard, on the other hand, you locate the plug-in, open it, find the control, and move the mouse as an extension of your finger, following the pointer with your eyes the whole time. Imagine how hard it would be to position the mouse pointer on the same spot with your eyes closed. Pointing a finger at a screen would not be nearly as hard, because you do not have to track a pointer's location; you already know where your finger physically is.
The same, I think, goes for moving a fader on the Raven, controlling a plug-in, pushing a button, etc.
What's missing, I think, is the sensation of touch, as described in the video. If you could physically feel the change, you would be able to close your eyes while moving a fader and still know you were changing its position.
Imagine feeling the sound through the fader you control, with the haptic feedback growing stronger as you raise the fader's level. That way you would get two cues confirming you're actually changing something, instead of relying on hearing alone. I believe it's easier to focus on listening if touch already confirms that a change is being made. Humans have also used touch and fingers to control things for thousands of years, so I don't believe a mouse gives the same sensation as touch with haptic feedback.
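Just to make the idea concrete, here is a rough Python sketch of the kind of mapping I imagine. The vibrate_at call is made up (no touch screen I know of exposes an API like this today), so read it as an illustration, not a real implementation:

def haptic_strength(fader_level_db, floor_db=-60.0, ceiling_db=0.0):
    # Map the fader's dB position to a 0.0-1.0 vibration strength,
    # so the feedback under your finger grows as you push the level up.
    clamped = max(floor_db, min(ceiling_db, fader_level_db))
    return (clamped - floor_db) / (ceiling_db - floor_db)

def on_fader_move(screen, fader):
    # Hypothetical screen API: drive the vibration under the touch point
    # every time the fader position changes, giving a second
    # confirmation cue alongside what you hear.
    screen.vibrate_at(fader.x, fader.y, strength=haptic_strength(fader.level_db))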
These are only theories of mine, but I would like to see research done on this area as it applies to DAW control, and I think it would be wise to look into it for future updates of the Raven. One reason so many people go for hardware controllers instead of the MTI, I suspect, is the lack of feeling what you control under your fingers.
To work around this issue in my studio today, I use a CC121 physical controller for Cubase. I reach for parameters and tracks on the Raven screen with my finger, select them, and change their values with the physical controller, keeping my left hand on the CC121 and my right hand pointing at the Raven.
This way I mark what I want to control and don't have to translate a name on the controller into what I see on the screen.
For example:
When you use the Pro Tools Control app on the iPad, or the Pro Tools Dock, you first have to locate what you want to change on your main screen, shift your gaze to the iPad screen, translate the function you want to change into the name the Control app gives it, and then change its value with a knob.
Instead, I reach for the parameter on the Raven with my finger, press it, and change its value with the AI knob, or use the fader/pan on the physical controller.
The AI knob does not work with all plug-ins, but with enough of them for what I do. Sometimes it's not useful, though, because it moves the on-screen parameter too slowly relative to the physical movement of the knob, so I have to rotate the encoder a lot for very little change on the plug-in.
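To illustrate the resolution problem: a relative encoder like the AI knob sends small +1/-1 ticks per detent, and if the host maps them one-to-one to a tiny parameter step, you get exactly this "rotate a lot, little change" behaviour. A user-adjustable sensitivity factor, as in this made-up Python sketch (nothing here is Cubase's actual code), would go a long way:

def apply_encoder_tick(param_value, tick, step=0.001, sensitivity=4.0):
    # tick is +1 or -1 per detent. With sensitivity=1.0, a full 0..1 sweep
    # would take about 1000 detents; raising sensitivity scales each
    # detent into a larger parameter move.
    new_value = param_value + tick * step * sensitivity
    return max(0.0, min(1.0, new_value))  # clamp to the 0..1 range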
I would love to hear others' thoughts on this. I believe haptic feedback through technology like the one in the video is the future for touch screens, and for DAW control through touch.