Macro to re-route channel output?

To eliminate latency when recording Virtual Instruments, I route the channel to a hardware output and monitor it through a mixer while I record.

Does anyone know if it’s possible to make a macro to switch the output of the selected channel?

I can of course do it manually, but it would be awesome to have it as a key command to quickly re-route the selected channel when recording, both to save time and to simplify the workflow.

Hi,

You can hold down Alt+Shift to enable Q-Link and change the routing of all selected channels at once in the MixConsole.

You could also consider using Direct Routing and switching between the outputs there.

I appreciate the suggestion, but I don’t quite understand how this would simplify directing a single VI to an alternate output (in this case output 3+4 instead of “Stereo”). It somehow seems easier just to change the routing manually. I was basically just looking for a way to put it on a key command.

The ultimate solution would be if the VI channel switched to an alternate output automatically while recording, somewhat like UA LUNA does with ARM, but I suppose that’s out of the question?

Hi,

Now I understand your use case, I’m sorry. So far I can’t come up with a solution, but I will keep thinking about it.

You should still take a look at Martin’s earlier suggestion to use Direct Routing. While it is a manual operation and not a Key Command, once you set it up you only need to click a button to use the output you want. When you change it using regular Routing you have to open the menu and navigate to the item you want; Direct Routing avoids all that. You’d have two buttons, one named “Stereo” and the other “Outboard Monitor”, and whichever you click on is where the signal goes. Stick it in a Template and it’s always there.

Depending on your specific needs, Direct Routing also lets you send the signal to multiple destinations at the same time. So if it doesn’t cause a problem, you could send the signal to both and stop switching altogether. If the signal is going to two destinations, you could probably set it up so you control which destination is ‘live’ by toggling their mutes. That you could do using Macros & the PLE.
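
If you go the mute-toggle route, a minimal sketch of how it might look in the Project Logical Editor (this assumes the two destinations are Group tracks the PLE can address; “Outboard Monitor” is only an example name, and exact menu wording may vary between Cubase versions):

Filter: Name | Contains | Outboard Monitor
Action: Track Operation | Mute | Toggle
Function: Transform

Save it as a PLE preset, assign it a Key Command, and optionally pair it in a Macro with a second preset that toggles the other destination.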

Using Direct Routing doesn’t work for low-latency monitoring purposes, since it is latency-compensated to stay aligned with the other channels - which is of course necessary if you are using it for parallel processing and such, but not in this case.

The benefit of routing VIs directly in the Routing section is that you monitor them without any delay induced by latency compensation, and can therefore play VIs while recording without any lag. Kind of a Low Latency Mode on steroids, since the mix stays the same - but you still don’t get any lag while you are playing. The way it needs to be for tight recordings. Somewhat along the lines of what Universal Audio is doing with ARM in LUNA.

In my perfect-world scenario there would be selectable options for how Low Latency Mode works:
1) The old/traditional way that it works right now.
or
2) Automatic routing of the selected channel via a separate out (for people with hardware mixers or solutions like UAD Console where you can have virtual outs) + (possibly) bypass of any latency-inducing plugins on the Master bus.
or
3) A combination of both.


It won’t give you zero latency, but it should DEFINITELY make things better.

In my case, what I usually find most problematic is playing/recording VIs when the production is big enough to cause latency.
I simply can’t lay down a really nice piano part when the sound lags behind the keys as I play. It is just one giant mood killer and frankly the biggest gripe I have with modern DAWs.

The low latency button in Cubase is decent, but usually the mix sounds so different when it’s engaged that it becomes distracting.
Doing it my way, the mix stays the same, yet the latency is still greatly reduced.

In short:
If you have a Universal Audio soundcard you have “virtual” outputs as well as real hardware ones.
I have set up two virtual stereo outputs in Cubase under Studio/Audio Connections/Output.
(Note that, depending on which sound card you have, the number of Virtual Channels is decided in the UA Console under MENU/View/Settings/Hardware; at the bottom of the window there is “CHANNEL DSP PAIRING”, where you can choose whether to prioritise DSP PAIRS or VIRTUAL CHANNELS.)

Why TWO outputs?

  1. The first one isn’t strictly necessary, but I find it extremely convenient to use it for the Stereo Out in Cubase (Studio/Audio Connections/Output) = you get Cubase on a fader in the UA Console. This comes in handy when you need to adjust monitoring levels (vocal or VI against the music and so on).
  2. This is the main one: when I need to record a piano VI or Rhodes VI or whatever, I change the routing for that specific channel in the Cubase MixConsole to the 2nd virtual out. What happens then is that:
    a) you are suddenly monitoring the VI directly through a channel/fader in the UA Console.
    b) the latency on the VI is greatly reduced, since the overall latency compensation now seems to be taken out of the equation.

This means that I can use a 1024 buffer and still be able to play my VI without any crazy lag and the mix sounds the same while I do it = the inspiration keeps flowing. When I am done I re-route the VI in Cubase so it goes thru the Stereo bus again.

Done!

Note: If you have crazy amounts of latency-inducing plugins on the track you are monitoring through in Cubase, there will of course be latency, but I usually only have the VI and an EQ or something on that track. Lower buffer = lower latency still holds true, but crazy low buffers aren’t necessary for virtually lag-free monitoring done this way. If I need FX for feel, like reverb/delay, I might add them in the UA Console, because there will be no lag and no extra DSP load in Cubase.
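
To put rough numbers on the buffer-size part, here is a back-of-the-envelope sketch (it assumes a 44.1 kHz sample rate and only counts the buffer itself; real round-trip figures are higher once converter and driver overhead is added):

```python
# Rough output latency implied by the audio buffer size alone
# (assumed 44.1 kHz sample rate; driver/converter overhead not included).
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int = 44100) -> float:
    return buffer_samples / sample_rate_hz * 1000.0

print(f"1024 samples: {buffer_latency_ms(1024):.1f} ms")  # ~23.2 ms - clearly audible when playing keys
print(f" 128 samples: {buffer_latency_ms(128):.1f} ms")   # ~2.9 ms - tight enough to play comfortably
```

That is why monitoring via a path that skips the plugin-latency compensation is what makes a big 1024 buffer workable here.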


I am sorry for jumping to the conclusion that you had an Apollo, but unfortunately, without a UAD soundcard (unless you can perhaps monitor a separate out on the RME through a desk?), I am afraid my latency “fix” doesn’t really work. It’s the same with UA LUNA: unless you are using an Apollo, Accelerated Realtime Monitoring (ARM) won’t work, and I am pretty sure you can’t even run LUNA without one, since ARM is part of its core functionality and there is no way around that without an Apollo or Arrow.

With my solution I can easily run a 1024 buffer and still record without any latency issues. Without the Apollo’s virtual outs - 1024 buffer = no way!


Tbh, I don’t think you are missing out on anything. In my opinion LUNA won’t be anywhere near Cubase feature-wise for a couple of years. There is so much missing and it’s so buggy at this point that I can understand why it hasn’t been ported to Windows yet. It feels like a giant beta test to me. I have no doubt that it will be great somewhere down the line, but the time I spent with it now was wasted.

Sure, the tape sims and Neve summing sound awesome and the ARM feature is nice, but there is no ARA support (which means no Vocalign, no Revoice), no sidechaining, no custom keyboard shortcuts (LUNA mainly uses Pro Tools shortcuts), no track presets, and so on and on and on.

LUNA feels to me like a really nice-sounding desk + tape machine with a clunky, not very flexible interface, so I have put it aside. I work 20 times faster in any other DAW at this point. Sure, some people will be OK with that, and some are switching from Pro Tools and will probably find it workable, but going at a fraction of the speed I am used to doesn’t work for me.

However, I really like UA and will most likely come back to LUNA in the future when it’s more mature, but not now.

You’re welcome! :slight_smile:

When I’m not using my direct-out trick I actually do the same for my keyboards. I’ve got a laptop with a soundcard routed directly into my Apollo, so I can play and record synths/Rhodes plugins etc. without the vibe-killing latency that seems to be an unavoidable part of recording nowadays. To avoid guitar latency I finally went with a Kemper and commit to a sound straight away.

DAWs are amazing in the sense that you can produce and mix music that would have been impossible before without a multi-million-dollar studio. If you “program” music or use loops it’s fine, but the recording process for instrumentalists has generally become an annoyance and a big step back unless you use additional hardware. The day a DAW maker comes up with a seamless native recording process without latency will be a big day!