These Plugins are Ruining Mixing

When I first used some of these plugins, I was blown away by how useful they were - but as time’s gone on and I’ve thought more about them, I’m starting to notice these plugins are ruining mixes.

If you’ve been using modern intelligent plugins, or you’re curious about them and are thinking about buying some, stick around. This video’s going to be a little different than what I normally do, but I want to share some important info about these plugins so that the mixes and projects you work so hard on can sound as good as possible.  

What are intelligent processors?

Intelligent processors are typically AI-driven plugins, or plugins with a complex behind-the-scenes algorithm, that determine how your signal will be processed.

For example, the Gullfoss EQ uses a hidden preset frequency response to determine how it should attenuate or amplify various frequencies.  The idea is that whatever signal you feed into it will be constantly shaped toward a balanced sound.

And if you’re like me, when you first saw or tried one of these, you thought: this is amazing, this will save me so much time.  But after using these plugins for about a year, I can say with all honesty that they’ve been making my mixes worse.

So, let’s talk about why.

Watch the video to learn more >

They Make Everything Dull

Whether it’s the Gullfoss EQ, Soothe 2, or Izotope’s AI assist, I’m noticing how dull they’re making mixes.

For example, Soothe 2 measures and attenuates resonances that have become too aggressive - but what is too aggressive?

Well, “too aggressive” or “unbalanced” is determined by the developers of these plugins - not by you, the person mixing the song.  Now, I’m sure their algorithms are well-designed, but that doesn’t mean they’ll actually improve your mix.

For example, say I have a vocal that has resonances in the mids - any good mixing engineer will tell you, don’t EQ these out.  You can adjust them, but trying to notch every unique aspect of the vocal that doesn’t fit into a preconceived notion of “balanced” is a bad idea.

Why? Because removing unique aspects of the vocal does not take into account the interaction of the vocal with other instruments, or how a particular genre allows vocals to behave in distinct ways.

A rock vocal can have distortion in the mids that sounds great, but what does Soothe 2 think? It’s going to remove it to recreate its predetermined balanced sound.
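To make the issue concrete, here’s a toy sketch of the kind of logic a dynamic resonance suppressor uses - not Soothe 2’s actual algorithm, which is proprietary, but a minimal stand-in of my own: flag any frequency bin that pokes above a smoothed baseline, then pull it down.  Notice the code has no way to know whether a peak is a harsh ring or intentional character - both get flagged.

```python
import numpy as np

def suppress_resonances(magnitudes, window=9, threshold_db=6.0, reduction=0.5):
    """Crude resonance suppressor (illustrative only): attenuate any bin
    that rises more than `threshold_db` above a locally smoothed baseline.
    The algorithm has no idea WHY a peak is there - a harsh ring and an
    intentional gritty midrange boost look identical to it."""
    # Smooth the spectrum with a moving average to get the "balanced" baseline
    kernel = np.ones(window) / window
    baseline = np.convolve(magnitudes, kernel, mode="same")
    # Bins exceeding the baseline by the threshold count as "resonances"
    excess_db = 20 * np.log10(np.maximum(magnitudes, 1e-12)
                              / np.maximum(baseline, 1e-12))
    flagged = excess_db > threshold_db
    out = magnitudes.copy()
    out[flagged] *= reduction  # pull the "offending" bins back down
    return out, flagged

# A flat spectrum with one sharp peak - musical character or a problem?
# The code can't tell; it gets attenuated either way.
spectrum = np.ones(64)
spectrum[20] = 8.0  # deliberate "character" peak
processed, flagged = suppress_resonances(spectrum)
```

Everything here - the window size, the threshold, the fixed reduction - is an arbitrary developer choice, which is exactly the point.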

So what we get is combative processing - in which one processor is trying to take out what another adds, and in the end we get either an over-processed sound, or worse, a boring sound.

At least Soothe 2 lets us adjust the emphasis bands, which can help us avoid unwanted attenuation - but the problem is even worse with something like the Gullfoss EQ, which offers much less control.

Watch the video to learn more >

They Make Decisions in Solo

By that, I mean they measure only the individual signal, not the signal combined with everything else.

To be fair, these plugins allow for side-chaining signals, which lets the plugin measure competing signals and, more or less, make decisions in the context of the mix.

But few people use these plugins in this way.  More often than not, they’re slapped on a signal, adjusted slightly, and left as is.

So, without an external signal, how does the plugin take masking into account? Does its preset algorithm determine how multiple frequencies in the mix will interact? Or is it simply adhering to its algorithm?

If I mix a vocal soloed, then add it back to the mix, it will not sound the way it does alone - so how accurate can the plugin’s measurement be if it doesn’t interact with all relevant instrumentation?

The only way the plugin can accurately determine how frequencies will interact is by side-chaining the rest of the mix whenever it affects a signal.  So, if I were to use Gullfoss on the guitar bus, I’d need a stereo signal with everything except the guitars to make its predetermined measurement accurate.
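As a rough illustration of what side-chain measurement actually gives the plugin, here’s a toy masking check - my own simplification, not any plugin’s real detection: a band in the target only counts as a clash when the side-chained rest-of-mix is also loud in that band.  Without the side-chain input, there’s simply nothing to compare against.

```python
import numpy as np

def masked_bands(target_mag, sidechain_mag, margin_db=3.0):
    """Toy masking detector (illustrative only): a band in the target is
    'masked' when the rest of the mix (the side-chain) carries comparable
    or greater energy in that same band.  Soloed, with no side-chain,
    the plugin would just be guessing."""
    target_db = 20 * np.log10(np.maximum(target_mag, 1e-12))
    side_db = 20 * np.log10(np.maximum(sidechain_mag, 1e-12))
    return side_db >= target_db - margin_db

# Guitar bus vs. everything-except-guitars, per frequency band
guitars = np.array([1.0, 4.0, 2.0, 0.5])
rest_of_mix = np.array([4.0, 1.0, 2.0, 2.0])
clash = masked_bands(guitars, rest_of_mix)
```

Only the band where the guitars clearly dominate escapes attenuation - and even then, the check knows nothing about genre or intent.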

So maybe this could work when I have only vocals on top of a beat, but in other situations, I don’t see this being a reasonable way to approach a mix.

Additionally, if we do use it on a vocal with a side-chained instrumental or beat, we encounter the previous problem: it can’t take the genre, the feeling, or any other intricate or complex ideas into account when adjusting the signal.

I could be wrong, but listen to this vocal being attenuated with the beat side-chained - can you honestly tell me the processed version sounds better? Or is the vocal just tamed and made kind of boring?

Watch the video to learn more >

Sometimes, They Make Horrible Decisions

So far we’ve been talking about the Gullfoss EQ and Soothe 2 - two plugins that I ultimately think are well designed and can be very useful, but are often misused, or used as if they’ll fix a mix’s problems without any additional effort.

But, when it comes to Ozone’s AI assist, the more popular I see it getting, the more I’ve felt the need to adamantly warn about how terrible it can make a mix or master.

For example, I was curious if it was smart enough to know that a track had already been mastered - so I imported an EDM master and clicked the assistant tool.

I promise you, this was literally the first time I clicked it - I wasn’t looking for an instance where it did something wrong.  But it decided the EDM track was Reggae and equalized it to match that genre, made it primarily mono, attenuated in-key frequencies that had previously given the track a musical sound, and then limited it by an additional 10dB, resulting in attenuation of up to 8dB at points and an integrated loudness of -5 LUFS.

Nothing ever needs to be that loud, but that decision is made even more interesting since the AI Assist is supposedly optimizing the track for streaming services that use loudness normalization.
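The arithmetic makes the point.  Assuming a common streaming playback target of around -14 LUFS (the exact figure varies by service and settings), a -5 LUFS master simply gets turned down:

```python
def normalization_gain(track_lufs: float, target_lufs: float = -14.0) -> float:
    """Gain (in dB) a loudness-normalized streaming service would apply
    to hit its playback target.  -14 LUFS is a commonly cited default;
    real targets differ between services."""
    return target_lufs - track_lufs

# A -5 LUFS master is turned down 9dB on playback
gain = normalization_gain(-5.0)  # -9.0
```

In other words, the extra limiting doesn’t survive normalization - only the density and distortion it introduced do.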

So, I doubt people will feed mastered tracks into this processor too often - but the fact that it can’t measure a track’s loudness before introducing processing, can’t accurately identify a genre, and can’t make decisions that make musical sense makes me wonder why anyone would trust this plugin with something they’ve worked so hard on.

Watch the video to learn more >

We Need to Be More Confident

These plugins have exploded in popularity - and for Soothe 2 and Gullfoss, I’ve previously said how great they were, and how they helped salvage some bad mixes of mine.

But, are they really helping? Or just making it so I don’t have to make difficult decisions when mixing?

The more engineers I talk with, the more I realize how many talented engineers completely underestimate how talented they are.  And if we’re constantly led to believe that our mixes are bad, of course we’re going to reach for a plugin that promises to fix everything for us.

But your mixes aren’t bad - the unique aspects of them are, well, unique; they keep the production interesting and ensure that everything doesn’t start to sound the same.

All of this is to say: next time you go to insert Soothe 2, or Gullfoss, or - god forbid - Ozone’s assistant, ask yourself, does this plugin really understand my mix better than I do?

Watch the video to learn more >