Very good site about loudness
  • Main site parts:

    • Introduction to Loudness – Loudness 101, explaining the basics in an easy-to-read, easy-to-understand fashion.
    • Broadcast Standards – What are the differences between the current standards? What are the significant details that are important to be aware of as a broadcaster or a deliverer of broadcast content?
    • Applications – It is one thing to know the technology and the standards involved; it is another to know how, where and when to apply the technology in real-world situations.
    • Events – TC has hosted several loudness seminars and has been present at broadcast and pro audio shows all over the world. An 'Events' page sums up these activities and is also a valuable source for in-depth videos.
    • Literature and Glossary – TC provides a massive collection of white papers describing and discussing all kinds of aspects of loudness technology, and finally there is also an extensive glossary explaining terms related to loudness.

    Check it out - http://www.tcelectronic.com/loudness.asp

  • Superb, @Vitaliy_Kiselev, thank you. The Katz paper on mastering is fantastic, http://www.tcelectronic.com/media/katz_1999_secret_mastering.pdf and that's one jewel amongst many. Really interesting stuff.

  • Great stuff. Of course this is essential info for sending material to broadcasters, but I wish there were more standards for web content as well (or at least an ethic). I'm so tired of watching stuff on Vimeo and YouTube only to get blasted by ultra-compressed, super-maximized, completely dynamic-less audio. That's where we're losing the loudness war.

  • Yep, @MirrorMan, either that, or the (sort of) opposite: really crappy distant miking of speech.

  • It's a good thing to understand that once you check the average speakers and headphones used for YouTube and Skype, you have a real temptation to compress it to hell.

  • You can't really be responsible for what people do once something has left your control. I had some experience of that working in old-fashioned analog broadcast radio: I was able to dial up receivers from around the world, and listen to how my transmissions were being received. It was always pretty depressing, particularly as the final transmission compression was out of our hands.

    But it was great to have an instant comparison between what was leaving my studio and what was being heard by my audience, which meant that I could experiment (a bit!) in real time. I learned that if you have good sound to start with (for example, really well-recorded speech with mics that are properly positioned), it will still sound pretty OK after it's been mangled by various things along the way between me and the listener at the other end.

    I've noticed a similar thing with video too - sometimes you will come across a very low-res video on youtube which shouldn't look great, but does. Probably because nice lighting and good composition still looks nice even when it's been mangled by YouTube.

    With both audio and video, I think the other thing worth experimenting with (and it reinforces the point about Skype etc.) is to try out your stuff - with audio, record it and play it back on ipods, in the car, on a TV etc. Similarly with video - play it over livestream, youtube (various resolutions), on ipods, TVs, PCs, whatever you have access to. A bit like my audio experiments in live radio, you will learn what looks and sounds good when it's been through the mangler. It's much more realistic (because it represents more of your audience) than just producing stuff to look good on a cinema screen!

  • Btw, it is also very interesting why YouTube compression is usually so shitty.

  • -23 LUFS, the number that is burnt into my eyes daily.

  • You're right, @Vitaliy_Kiselev. Although it's understandable that people will use their ipods, internal computer speakers or cheap headphones for playback. Convenience nearly always wins out. But that doesn't help the loudness situation. Increased compression and loudness mean loss of dynamics, and people get used to that. So shitty becomes the standard. I guess whoever engineers YouTube's codecs knows this. It gives them a license to skimp on quality (except now we get to "choose" a 1080p version if it's available ;)

    Funniest thing is I've heard people complain that Vimeo is a high brow snobby site because "their videos take so long to load".

  • I think that if the majority of laptops had better speaker/amplification setups, there would be even less justification for over-aggressive mastering. It's frustrating how much you have to struggle to hear quiet passages on many such systems, especially if there's any background noise. And if you are carrying around a laptop and want to play something for your friends, you don't generally pull out a y-cable and split headphones, so it remains a practical problem.

    Some mastering engineers are better at handling the problem than others. One of my favorite examples is how Richard Dodd did a really good job of mastering Andy Hunter's electronica EP "Life" in 2005. Even with MP3 compression, you get a surprisingly good compromise between loudness, dynamics and separation, and the songs sound/feel appropriately different without feeling jarring - compare "Lifelight" or "Wonderful" to "Come On". The original tracks had a lot of saturation in the instruments that helped the perceived loudness to begin with, but poor mastering could have ruined that release.

    But in terms of online audio that drives me nuts... MySpace. I've said it before and I will say it again: why 22 kHz instead of 44.1 kHz? And why not at least warn the artists? C'est la vie.

  • The newer DAWs have built-in, on-the-fly loudness metering that conforms to the newer international and European standards (EBU R128 and ITU-R BS.1771): http://pro.magix.com/en/sequoia/whats-new.529.html

    These are replacing peak metering, although of course you need to see peaks as well.

    Re Katz article: I would never, ever master a project using those techniques, and several of the seriously important, essential, can't-master-without-them techniques are completely missing from the article. Sorry to have such a strong opinion, but that's my 2 cents.
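
    For anyone who wants to sanity-check these numbers outside a DAW, here is a minimal Python sketch of the kind of metering mentioned above, using the third-party soundfile and pyloudnorm packages (pyloudnorm implements BS.1770-style K-weighted, gated measurement). The file names are placeholders, and the gain-only normalization to -23 LUFS is not a full R128 delivery check; it ignores true peak and loudness range.

    ```python
    # Measure integrated loudness (ITU-R BS.1770, the basis of EBU R128) and
    # apply a simple gain shift toward the -23 LUFS broadcast target.
    import soundfile as sf          # pip install soundfile
    import pyloudnorm as pyln       # pip install pyloudnorm

    data, rate = sf.read("mix.wav")                 # "mix.wav" is a placeholder

    meter = pyln.Meter(rate)                        # K-weighted, gated BS.1770 meter
    loudness = meter.integrated_loudness(data)
    print(f"Integrated loudness: {loudness:.1f} LUFS")

    normalized = pyln.normalize.loudness(data, loudness, -23.0)
    sf.write("mix_r128.wav", normalized, rate)
    ```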

  • @DrDave do you have some links to other mastering resources? And thanks for the tidbit about the new DAW implementations...

  • There is so much on the web. But it is scattered.

    If I had to list the nine biggies... the things that make a real difference in the final sound:

    1. Basically, the image is very, very important, and it involves the placement of the mics and then the related timing of the mics at the mastering stage. Plus the ratios and panning. But timing is crucial.

    2. You need a direct path to the monitors, bypassing any and all computer junk (I use RME+ASIO, but there are many solutions).

    3. Layered physical modeling, including separate spaces for individual tracks with layers, as appropriate, and mixing algorithm-based reverbs with impulse-based ones. Don't overdo it. Make sure your effects are stacked in the right order--a very common mistake.

    4. Don't use compression; use parallel compression or adjust the sounds with pull-ups, splits and volume. This is the difference, quite simply, between astroturf and a garden. (See the parallel compression sketch after this list.)

    5. Pull-ups--I use fifty of these per disc during mastering. Absolutely vital to skim over the project and fix it at the mastering stage.

    6. Lexicon 96. Period. TC is a good company, and I can understand why the word "Lexicon" is not in their "Lexicon" (the Katz article), but, seriously. Really.

    7. Maintain a multitrack editing environment using four-point cross-cut editing throughout the entire process, then burn the master CD directly from the multitrack environment, using POW-r 3 or a custom dither (a simple dither sketch also follows this list). Make sure the timing (see No. 1) is correlated to the cross points. Use custom crossfades, or get some from someone who knows how to make presets. DAO, not TAO, for CDs. Using bit-matching, maintain a bit-accurate copy for archival and comparison.

    8. If you don't have spectral editing at the track level, you must make a 32-bit master (no dither), run it through spectral, and then set your master.

    9. Maintain the 32-bit master for archival purposes and for the growing ultrafi market. Also consider using two converters, or just go with HD.
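
    As a rough illustration of point 4, here is a small numpy sketch of parallel ("New York") compression: the dry signal is left untouched and a heavily compressed copy is mixed in underneath it. The toy compressor, the threshold/ratio/attack/release numbers and the 0.4 blend are made-up stand-ins for illustration, not anything described above.

    ```python
    # Parallel compression sketch: dry signal stays intact, a crushed copy is
    # blended underneath to bring up low-level detail without squashing peaks.
    import numpy as np

    def simple_compressor(x, sr, threshold_db=-30.0, ratio=8.0,
                          attack_ms=5.0, release_ms=100.0):
        """Toy mono peak compressor: envelope follower plus static gain curve."""
        atk = np.exp(-1.0 / (sr * attack_ms / 1000.0))
        rel = np.exp(-1.0 / (sr * release_ms / 1000.0))
        env = 0.0
        gain = np.ones_like(x)
        for i, s in enumerate(x):
            level = abs(s)
            coeff = atk if level > env else rel
            env = coeff * env + (1.0 - coeff) * level        # smoothed level
            level_db = 20.0 * np.log10(max(env, 1e-9))
            if level_db > threshold_db:
                over = level_db - threshold_db
                gain[i] = 10.0 ** (-(over - over / ratio) / 20.0)  # cut only the overshoot
        return x * gain

    sr = 44100
    t = np.arange(sr) / sr
    dry = 0.8 * np.sin(2 * np.pi * 220 * t) * np.exp(-3 * t)   # decaying test tone
    wet = simple_compressor(dry, sr)
    parallel = dry + 0.4 * wet     # dry path untouched; compressed copy mixed under it
    ```

    The point is simply that the untouched dry path preserves the transients while the compressed path fills in underneath; in practice this would be a bus send into a hardware or plug-in compressor, not numpy.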
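
    And for the word-length reduction in point 7: POW-r 3 itself is a proprietary noise-shaped dither, so the sketch below just uses plain TPDF (triangular) dither as a stand-in, to show where dithering sits, applied once, at the final reduction from the 32-bit float master to 16-bit PCM.

    ```python
    # TPDF dither sketch for the final 32-bit float -> 16-bit reduction step.
    import numpy as np

    def dither_to_16bit(x, rng=None):
        """x: float signal in [-1.0, 1.0]; returns int16 samples with TPDF dither."""
        rng = rng or np.random.default_rng()
        lsb = 1.0 / 32768.0                          # one 16-bit LSB in float terms
        # The sum of two uniform variables gives a triangular PDF spanning +/- 1 LSB.
        noise = (rng.uniform(-0.5, 0.5, x.shape) + rng.uniform(-0.5, 0.5, x.shape)) * lsb
        dithered = np.clip(x + noise, -1.0, 1.0 - lsb)
        return np.round(dithered * 32768.0).astype(np.int16)
    ```

    The usual advice is to dither once, at the very end of the chain, not at intermediate stages.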

  • @DrDave thanks for this. I know you'll get a different opinion from every engineer, some more technical than others, but every angle helps.

    Re: "Make sure your effects are stacked in the right order..." - I've wondered about this. A quick search yields a few pages that discuss mastering chains for specific software, but not much by way of general rules.

  • Let's say you have a physical modeling space for a solo instrument or voice in a large mix. You want to apply any reverb or physical modeling to that instrument before the final layer of reverb; or, with a voice, if you are de-essing or de-popping places, you want to do that before the individual reverb channel, and that layer then comes before the final reverb layer. Otherwise you add reverb to the ssssss sound, which is basically what you hear on half of the recordings out there. Also, if you are adjusting the track timing on a percussion solo spot, you have to do that before any reverb or you will scatter the sound. If your drum mic does not have the correct delay plugged into your DAW track, you will lose the clean, crisp sound. But if all of your timing arrival points are mathematically perfect, your image will have no depth, and your tracks will not interact correctly with your reflection settings in your room simulators and reverb units. The mastering round allows you to tweak all of the settings until the image snaps into focus. (A rough sketch of this ordering follows below.)

    In the Katz article, you won't even find the words "spectral" or "Cedar", but if you aren't using one of these, then you don't have the basic mastering tools. He talks about using the "Finalizer"--part of TC hardware--to master the image. I didn't use it 12 years ago, and I don't know anyone who does now. Basically, times have changed, and you can't overly fault an outdated article (well, maybe), but twelve years ago I was using a Lexicon 480L and a PCM 90, like most mastering engineers, or the TC 5000 and Sys 6000, and so on, and then rapidly moved to the convolution and plug-in reverb models, along with the newer Lexicon hardware.
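
    To make the ordering above concrete, here is a toy Python sketch (numpy + scipy) of the chain for a single close mic: timing correction first, then the track's own space, then the shared final reverb layer. The 3 ms offset and the decaying-noise "impulse responses" are invented purely for illustration; a real session would use measured IRs or algorithmic reverbs as discussed above.

    ```python
    # Toy ordering sketch: align timing BEFORE any reverb, apply the per-track
    # space next, and put the shared/final reverb layer last.
    import numpy as np
    from scipy.signal import fftconvolve

    sr = 48000
    rng = np.random.default_rng(0)

    def toy_ir(seconds, decay):
        """Decaying noise burst standing in for a measured impulse response."""
        n = int(sr * seconds)
        return rng.standard_normal(n) * np.exp(-decay * np.arange(n) / sr)

    def align(track, delay_samples):
        """Shift a track later (positive) or earlier (negative) in time."""
        if delay_samples >= 0:
            return np.concatenate([np.zeros(delay_samples), track])[: len(track)]
        return np.concatenate([track[-delay_samples:], np.zeros(-delay_samples)])

    drum = np.zeros(sr); drum[0] = 1.0                       # dry close-mic "hit"
    drum = align(drum, int(0.003 * sr))                      # 1) timing correction first
    track_space = fftconvolve(drum, toy_ir(0.3, 20.0))[: len(drum)]   # 2) per-track space
    mix_bus = track_space                                    # (other tracks would sum here)
    final = fftconvolve(mix_bus, toy_ir(1.5, 4.0))[: len(mix_bus)]    # 3) final reverb last
    ```

    The point is simply the order: timing edits happen on the dry tracks; once reverb is printed, moving things around smears the image.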

  • Or the nicer-sounding Bricasti (IMHO).

  • @DrDave These are fantastic - thanks for the link. I've been collecting my own recordings of impulse responses of castles and cathedrals etc, whenever I visit one and have a recording machine handy, but this is a great collection and really useful to add to the ones I already have. I miss the really good Worldwide Soundspaces site, which seems to have disappeared from the web. It had some seriously wonderful ones - such as the insides of industrial gas-tanks.

    As @MirrorMan says - good to have lots of opinions. I notice neither @DrDave nor the article mentions, for example, checking the mix in lots of different environments and on lots of different equipment that the typical end-user might be using. That's crucial. My last CD was designed to be played most often to groups in large community centres as well as sometimes in cars and only rarely on CD players at home, and once I'd tried that one out in various environments it led me to change a few things. That's probably more mixing than mastering, but still. There's such a huge amount to sound! And we've all come to it via different routes and with different experiences.

    Anyone reading all this might wonder why we're all so passionate about "interfering" with audio, but I absolutely agree that the end-listener needs "help" from the eventual audio mix. It's not about "realism". Alas, perhaps! It's tempting to go totally purist, but years ago I worked with a blind person who really struggled with producing mixes that others would accept. Not what you'd instinctively think - her mixes were totally realistic but lacked the subtle pointing-up of clues which go into a really good end-product. It made me realise the importance of producing a mix that gives the end-listener a bit of subtle help. It's the audio equivalent of tweaking curves and looks.

  • @DrDave Got 'em, and still not the same as the M7. I made my own set from my box for lugging around; never quite there, but not bad.

  • Of course you need to check the mix in the car, on an ipod, on YouTube, etc., but after a hundred or so you learn where the reference speakers are in relation to these, and I also think the technology is changing too fast now, so I lean on the reference speakers much more than in the past.

    Samplicity makes great impulses! For harp I do like the Lex 96... As far as realism goes, that is where the timing is really crucial.

  • Fark... and I thought video was complicated.