Hi. This time, I wanted to write about understanding the LFE (Low Frequency Effects) channel and the challenges I faced with it in my early days of mixing, challenges that continued for some time until I figured out the reason. This is going to be a long post. But before that, let me give you a brief history of how I used to use it.

My Learning

My initial days of mixing started with the idea that the LFE was meant for anything with lows in it. For a long time, I assumed that anything like the kick or bass had to go to that channel because, isn't that the one for low frequencies? This was where my understanding of the way cinema sound works was limited, and as I learnt more, I felt it should be shared. The reason I say this is that even today, I see many beginners and engineers making the same mistakes I used to make without noticing it.

The realization came one day as I was listening to a song in a movie I had mixed while the Dolby consultant was mastering it. Suddenly I felt that the low end of the song didn't sound right. For a while I thought my ears were tricking me because of fatigue (I had been working very long hours, and I was an associate mix engineer then). I turned to my friend and Dolby consultant Bharat and asked him if he felt the song lacked bass. Now, this is what's good about having friends who are also consultants: they don't bullshit, and they can be straight as an arrow. Needless to say, he was sensible about it; he said it didn't sound like my other mixes and assumed I had tried a different route. After a moment of contemplation, which was about a second long, we loaded up the session to check.

I played it back and, as I was expecting, the lows were back in form. This puzzled me. But I have to admit two things here. I was an assistant to a good mix engineer who gave me the freedom to take responsibility for the mixes I did, and I was very eager to troubleshoot. Still, it was hard to pin down. Finally, after a few minutes, we played the mastered portion, and it lacked bass. Bharat then asked me to invert the phase on the LFE channel in my mix and listen. Sure enough, it sounded light on bass, but exactly the way the 5.1 mastered version played. Then, as an experiment, we inverted the phase on the LFE channel and mastered again. It sounded the right way.

This was when I learnt two of the most important lessons that changed my mix and my approach to it.

  1. The mastering process has a low pass filter on the LFE.
  2. Always listen with the encode option on the Dolby unit while mixing. (Though this induces a 2-frame delay relative to video due to the encode-decode process on the Dolby DMU, it can be offset on the video track in Pro Tools.)

But it took me a few more years to understand the explanation behind it. That came from another very good friend and live sound engineer, Niranjan. The learning was about filters and the mystery behind them.

The Reason

In the earlier note, you would have seen how inverting the phase on the LFE channel made it work. This got me thinking about the role phase was playing. The truth, when it hit me, was simple yet profound. The LFE is routed to the subwoofer. The mistake I was making, and we sometimes still make, is this: we have a send to the LFE in Pro Tools, and if it's a kick or a bass track or anything that needs that extra oomph, we increase that send. Now, in today's mixes, it is very rare to do a print master as before, since all formats are digital and there is no encode-decode process. (It was for print, and honestly, the print format is no longer very prominent. We all saw the pain Quentin Tarantino had to go through for The Hateful Eight.) Because there is no print master, we want the LFE channel that we record to be clean and carry only the low frequencies. We don't want the higher frequencies from, let's say, an explosion or a stereo perc loop that has good bass to go into the LFE channel. So, what do we do? We insert a low pass filter. And to get the cleanest sound, we use the highest slope. This is where the issue happens. Why? Because filters introduce a phase shift, and since the same signal is present in the main channel as well as the LFE (they are coherent), they will interfere. (Remember that we are sending from the main channel.)

Take a look at the video below to understand the above note. There is no audio to be heard, but you can instantly see what is happening.

 

This is the routing: the 7.1 Center channel and the LFE are combined into a mono bus through 2 separate auxes so that pan law doesn't affect the signal displayed. (Pan law causes the signal to drop by 3 dB, or whatever is set in the Session Setup > Pan Depth section.) This mono bus is then shown on a mono aux for monitoring. (It's also there to simulate the phasing issue; otherwise we wouldn't be able to visualize it.) This mono channel represents exactly what happens in the real world when the LFE interacts with, in this case, the Center channel. I will be using a 120 Hz sine wave for the demonstration, though things are a little different with complex waveforms. Watch what happens when the LFE send becomes +3 dB.
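If you'd like to reproduce this outside Pro Tools, here's a minimal sketch of the same routing in Python. I'm assuming a 4th-order Butterworth as the "highest slope" low pass and a 48 kHz session; the exact filter in the video may differ, but the mechanics are the same.

```python
# Center channel + low-passed LFE send, summed to mono as in the demo.
import numpy as np
from scipy.signal import butter, sosfilt

fs = 48000
t = np.arange(fs) / fs
main = np.sin(2 * np.pi * 120 * t)            # "Center": a 120 Hz sine

# LFE send: the same signal through a low pass at 120 Hz.
sos = butter(4, 120, btype='low', fs=fs, output='sos')
lfe = sosfilt(sos, main)

# A 4th-order Butterworth is 3 dB down and 180 degrees shifted at its
# own cutoff, so a +3 dB send makes the LFE copy equal and opposite.
send_gain = np.sqrt(2)                        # +3.01 dB
mono = main + send_gain * lfe

rms = lambda x: np.sqrt(np.mean(x[fs // 2:] ** 2))   # skip filter settling
print(f"center alone : {20 * np.log10(rms(main)):6.1f} dB")
print(f"center + LFE : {20 * np.log10(rms(mono)):6.1f} dB (deep null)")
```

The two copies end up equal in level and opposite in phase at 120 Hz, which is exactly the cancellation the meters show in the video.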

 

How can we overcome this? From my understanding, there are 2 ways. (Please feel free to comment if you find additional techniques.)

  1. Have a separate design track for the LFE.
  2. Understand filters.

I think we all do the first part sometimes. But it is very useful to understand filters and their secret role in our daily lives. My knowledge of filters isn't very deep, yet it is something I did spend some time trying to understand; not from a software or electrical design point of view, but from a mixing point of view. This will change the way you look at EQs and filters. To understand filters, we need to understand a few other things first.

Phase

What is phase? Phase is simply the difference in time vs. amplitude between two sine wave sources. This is the technical definition, but keep the two keywords in mind: time and amplitude. One thing I didn't understand was: if it is a difference in time, why is it measured in degrees? To see why, look at the figure below.

[Figure: Phase Cycle]

We all know that a waveform is amplitude on a time scale. But it can also be represented as a circle, as above. You will notice that if the point moves ahead along the circle, the waveform shifts, like a time delay. The interesting part is that no matter what the radius of the circle is, whether the circle is big or small, a given angle will always be at the same point on the circle; 90 degrees, 180 degrees, etc. land on the same point of the circle but at different points on the time scale. This means the frequency can be entirely different, yet the degree doesn't change, even though the time it corresponds to does. This led me to understand that phase has no meaning without the frequency it is associated with. But how is this useful to us? Well, before I go into the LFE explanation, let me try it with a simple recording technique that some of us are familiar with.

If we have miked a snare on top and bottom, experience would tell us to invert the phase on the mic below because the pressure on the diaphragm of the bottom mic will be the mirror image of that on the top. But that’s not what I was getting at.

Calculating Phase

Phase can be calculated as:

Phase angle φ (degrees) = time delay Δt × frequency f × 360
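To tie this back to the circle idea, here is the formula rearranged in a quick sketch: the same 90-degree shift corresponds to a different amount of time at each frequency.

```python
# t = phase / (360 * f): one phase angle, different times per frequency.
for f in (60, 120, 1000):
    t_ms = 90 / (360 * f) * 1000
    print(f"90 degrees at {f:>4} Hz = {t_ms:.3f} ms")
```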

Don’t get worried about the formula. I will use the above snare example to explain some stuff.

[Figure: Snare Miking]

Now, notice that there are two mics, on the top and the bottom, and the times of arrival at the top and bottom mics are t1 and t2 respectively. The difference in time, t2 - t1, is our time delay Δt. For ease of calculation, let's take t2 - t1 = 4 ms (4/1000 of a second). (I know that may not be the case in the real world, but let's assume so for now.)

We all know that cancellation occurs at 180 degrees. So, if we put 180 into the equation above, then

180 = 0.004 s × f × 360

This gives us the cancellation frequency as

f = 180 / (0.004 × 360)

f = 125 Hz.

This is where the first cancellation occurs. Wait, what? First cancellation? Yes, there are higher frequencies that cancel too. How? Remember that the degrees were represented as a cycle in the first figure? That means after one complete turn (360 degrees), the point of 180 degrees reappears. So the next point of cancellation is at 180 + 360 = 540 degrees.

If you put this in the above equation, you will get f = 375 Hz.

Is there a pattern? Yes. Look at the initial time delay: t2 - t1 = 4 ms. Now, the fundamental frequency whose period is 4 ms (0.004 seconds) is

1/0.004 = 250 Hz.

If you look at the cancellation values we got, they are at 125 Hz, 375 Hz, and so on, which is 0.5 × 250, 1.5 × 250, etc. So, the rule is

Cancellation will occur at 0.5×, 1.5×, 2.5×, etc. the fundamental frequency, while addition will occur at 1×, 2×, 3×, etc. the fundamental frequency.
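A quick sketch of that rule in Python, using our hypothetical 4 ms delay:

```python
# Comb-filter nulls and peaks for a given time delay between two
# coherent signals (here, the hypothetical 4 ms inter-mic delay).
delay = 0.004                                  # seconds (t2 - t1)
f0 = 1 / delay                                 # fundamental of the delay: 250 Hz
nulls = [(n + 0.5) * f0 for n in range(4)]     # 0.5x, 1.5x, 2.5x, 3.5x
peaks = [(n + 1.0) * f0 for n in range(4)]     # 1x, 2x, 3x, 4x
print("cancellations:", nulls)                 # [125.0, 375.0, 625.0, 875.0]
print("reinforcements:", peaks)                # [250.0, 500.0, 750.0, 1000.0]
```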

An interesting application of this while recording with multiple mics is that if you know the tone you are looking for, adjusting the mic distance will help you get it. Of course, some of the lower frequencies would require the mic to be at an outrageous distance; a quick workaround is to get to the closest cancellation length and then invert that mic. Anyway, recording wasn't my strong point, so I will get back to the LFE, but I hope you understood how phase in degrees relates to time.

In a mixing scenario, the same thing happens if you take a signal of, say, 1500 Hz, delay it by 1 ms, and add it back to the main signal. Here, 1 ms is a phase shift of 540 degrees at 1500 Hz, which means it will perfectly cancel. So creative delay times can also change the tonality. (This can help create interesting sound design: add multiple delay times to a vocal and sum it back, like a phaser but with much more control over the frequencies you want to manipulate, now that we know the math! Add this signal to a reverb or a stereo delay, blend it with some of the methods I have suggested in other blog posts, and that's a ton of options! On a separate note, if you wanted to build a brick wall filter out of the same principle, the delay the signal would need would probably be months!)
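Here is that exact experiment as a sketch: a 1500 Hz tone, delayed by 1 ms and summed back.

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 1500 * t)        # 1500 Hz tone
delayed = np.roll(sig, int(0.001 * fs))   # 1 ms = 48 samples = 540 degrees
# (np.roll wraps around, but the buffer holds a whole number of cycles,
# so the wrap is seamless for this steady tone.)
summed = sig + delayed

print(np.max(np.abs(summed)))             # ~0: the 540-degree shift cancels
```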

[Figure: Delay for Cancellation]

Is phase always cancelling? No, it adds at other values too. Here is a phase template that plots these changes; it's handy to keep as a reference.

 

[Figure: Phase Template]

Interesting note: why is this useful in DSP? Phase again, or rather, time. To work on a frequency, the sample rate has to be at least twice that frequency (the Nyquist theorem). A DSP chip must finish all its computations within one sampling period so it is ready to process the next sample, so the maximum execution time of an algorithm sets the highest sample rate it can support (frequency = cycles / time). Theoretically, that gives an HDX chip only about 5 microseconds (one sample period at 192 kHz) per sample, and even less in practice once a chip is shared, which is where math optimization for plugins plays an important role. My knowledge of DSP is too poor to go into much more detail, but I assume similar math is also part of why different native audio engines sound different between DAWs. It would be interesting to read more on that at some point.
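Just to put numbers on that per-sample budget:

```python
# Time available per sample at common sample rates.
for fs in (44_100, 48_000, 96_000, 192_000):
    print(f"{fs:>7} Hz -> {1e6 / fs:6.2f} microseconds per sample")
```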

 

What is a Filter

A filter is a frequency-dependent amplifier circuit. It can exist as low pass, high pass, band pass, and band stop, as in the figure below.

[Figure: Types of Filters]

But based on other parameters, there are different kinds of filters too. One common example is the Butterworth filter. In its first-order form it rolls off at 6 dB per octave, and a Butterworth is 3 dB down at the cutoff frequency regardless of order.

This is again very important to know. If you put a 6 dB per octave filter at 120 Hz, for example, it means 120 Hz itself drops by 3 dB at the output. My initial understanding was that 120 Hz would stay at the same level and 240 Hz would drop by 6 dB, being an octave above. But as I read, I understood better. This also explains why, in the video above, the signal cancelled completely at a send level of +3 dB: the send gain exactly made up for the 3 dB loss at the cutoff.

As the slope increases, the order increases: 12 dB per octave is 2nd order, 24 dB per octave is 4th order, and so on. A Butterworth shifts phase by 45 degrees per order at its cutoff. This is also interesting: a 4th-order low pass at 120 Hz shifts 120 Hz by 4 × 45 = 180 degrees, a full inversion, which is the other half of the cancellation we saw. A quick check of both numbers follows.
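You can verify the 3 dB drop and the 45 degrees per order with a sketch like this:

```python
# Butterworth low pass measured at its own cutoff, for increasing orders.
import numpy as np
from scipy.signal import butter, sosfreqz

fs, fc = 48000, 120
for order in (1, 2, 3, 4):
    sos = butter(order, fc, btype='low', fs=fs, output='sos')
    w, h = sosfreqz(sos, worN=[fc], fs=fs)
    mag_db = 20 * np.log10(np.abs(h[0]))
    phase = np.degrees(np.angle(h[0]))    # order 4 may print as +180 (wrap)
    print(f"order {order}: {mag_db:6.2f} dB, {phase:7.1f} degrees at {fc} Hz")
```

So how are these filters actually working in the real world, without plugins? Well, filters are constructed as shown in the diagram below.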

[Figure: Simple Low Pass Filter. Image courtesy Google; apologies, I can't find the source.]

The capacitor is what acts as the actual filter. The key idea is capacitive reactance: the effective resistance of a capacitor changes with the frequency of the signal passing through it. At low frequencies the capacitor has plenty of time to charge and discharge, so it resists the signal; as the frequency increases, the charge-discharge time shrinks until, past a point, the signal passes straight through the capacitor as if it were a short circuit. (Basically, if I were a capacitor and had to stop one person per minute at a gate, easy peasy. But if it's the opening of a Metallica concert and I am the only one there, well, what can I say? Get it? Is gatecrashing close to a capacitive filter?!? 😀)

Look at the figure above and you will see that Vout drops toward 0, because above a certain frequency the entire signal passes through the capacitor to ground. Now, charging and discharging take time, which is a delay, and as we now know, a time delay in relation to frequency is a phase shift. This is how filters cause phase shifts.
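For a feel of the numbers, here's a sketch with hypothetical component values chosen to land near our 120 Hz example:

```python
import math

# Hypothetical RC low-pass values; fc = 1 / (2 * pi * R * C).
R, C = 1_000.0, 1.33e-6                       # 1 kilo-ohm, 1.33 microfarads
fc = 1 / (2 * math.pi * R * C)                # ~120 Hz cutoff
Xc = lambda f: 1 / (2 * math.pi * f * C)      # capacitive reactance in ohms

print(f"cutoff = {fc:.0f} Hz")
print(f"Xc at 60 Hz    = {Xc(60):7.0f} ohms (lows blocked from ground)")
print(f"Xc at 12000 Hz = {Xc(12_000):7.0f} ohms (highs shorted to ground)")
```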

I haven't gone into the Linkwitz-Riley and others because they are a bit more complex to explain. But the Linkwitz-Riley is a combination of Butterworth filters, and the Pro Multiband uses an 8th-order Linkwitz-Riley, where the phase at the crossover is 360 degrees, i.e., zero cancellation between the bands. This is quite amazing: no matter how you adjust the bands, there won't be phase issues between them. That full-cycle rotation is also the reason this plugin can't be used for parallel compression, since the processed path carries a phase shift the dry path doesn't. Still, if you use the Avid Pro Multiband splitter, rest assured the signals will sum with no cancellation at the split.
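As a sketch of why that works, here is a 4th-order Linkwitz-Riley built the textbook way, two cascaded 2nd-order Butterworths per band (the Pro Multiband's 8th-order version extends the same construction). The low and high bands sum back flat:

```python
import numpy as np
from scipy.signal import butter, sosfreqz

fs, fc = 48000, 120
lp = butter(2, fc, btype='low', fs=fs, output='sos')
hp = butter(2, fc, btype='high', fs=fs, output='sos')

# Cascading a 2nd-order Butterworth with itself gives one LR4 band.
w, h_lo = sosfreqz(np.vstack([lp, lp]), worN=2048, fs=fs)
_, h_hi = sosfreqz(np.vstack([hp, hp]), worN=2048, fs=fs)

# The two bands sum to an allpass: flat magnitude, no cancellation.
ripple_db = 20 * np.log10(np.abs(h_lo + h_hi))
print(f"max deviation from flat: {np.max(np.abs(ripple_db)):.6f} dB")
```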

But we were talking LFE

Yes. Here is where the filter I mentioned at the beginning of this post makes sense. If that filter sits at a frequency that matches a fundamental note of the bass or kick, etc., it will begin to cancel against the signal from the main speakers. The easiest way to circumvent that, and my go-to weapon, is the Pro Subharmonic from Avid. The logic is this:

  1. It generates waveforms based on the input signal.
  2. It gives you gain control at half of the input frequencies; e.g., for a range of 60 and 90 Hz, as shown in the figure, it gives control at 30 and 45 Hz. This also means it is somehow generating half-frequency waveforms, thereby constructing true subharmonics that make sense in the program. (See the toy sketch after this list.)
  3. It cannot cancel with the main signal, because for signals to cancel they need to be coherent. Since this plugin is generating new signals, the LFE will never carry the exact same signal as the main channels.
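Avid doesn't document how Pro Subharmonic synthesizes its content, so as a toy illustration only, here is one classic way to get an octave-down signal, the flip-flop divider used in old analog octave boxes. The point isn't the method; it's that synthesized material isn't phase-coherent with the source, so there is nothing to cancel against.

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 60 * t)            # 60 Hz input

# Toy octave divider: toggle a flip-flop on each positive-going zero
# crossing, i.e. once per input cycle, giving a square wave at 30 Hz.
state = 1.0
sub = np.empty_like(x)
for i in range(len(x)):
    if i > 0 and x[i - 1] < 0 <= x[i]:
        state = -state
    sub[i] = state
# Low-pass 'sub' to taste and tuck it under the original for weight.
```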

There are others out there, like the LoAir, the Aphex Big Bottom, and the hardware dbx 120A. For some reason, I like the way Pro Subharmonic behaves in the lower frequency region; it also works from mono all the way to 7.1 and is AAX-DSP too. Understanding all this changed the way I look at filters and the LFE, how I choose what to send to the LFE, and the very important question of whether I need to send anything at all. And remember, EQs are in a way filters too, so they work similarly; but that's probably for a different post. For now, let's break it down with the LFE!

-FM