Minor update — just in case anyone out there has additional thoughts and contributions to make on the subject.
The following 'technical' commentary is derived from discussions with one of CTV's chief engineers and with one of Shaw Media's (Global's) senior directors of engineering and broadcast systems.
With respect to the digital audio attenuation issue (apparently universal):
This is a puzzling problem which, the above individuals advise, still needs to be resolved or at least explained.
The baseline loudness setting ("dialnorm", short for dialogue normalization) is the reference point to which the Dolby processor in the receiver aligns.
Shaw boosted the audio 6 dB to bring the levels closer together, but that "non-standard" setting caused all sorts of problems for BDUs.
The issue has been discussed with Dolby Labs, who developed the AC-3 audio system used in DTV transmission. Their comment is that there "shouldn't" be an imbalance if everything is set up correctly, but that there "often" is, with the DTV audio specifically being lower.
Dolby did explain that although in theory the audio level should not attenuate with digital compared to analog, in practice it usually does. Dolby Labs did not implement this very well: dialnorm intentionally lowers the audio. This may not mean anything, but if you feed in -20 dBFS audio and set a dialnorm of -24 (the value recommended in the ATSC standard), then decode that back to PCM audio, you get -27 dBFS audio, or 7 dB lower.
Since -3 dB represents half the energy, 7 dB down is actually quite a lot. In theory the TV and DVD recorder are "supposed" to compensate for this, but apparently that is where the "doesn't really work" part comes in.
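The arithmetic above can be sketched in a few lines. This assumes the common model in which a compliant AC-3 decoder attenuates the audio by (31 + dialnorm) dB, i.e. relative to the AC-3 reference dialog level of -31 dBFS; the function names are just for illustration.

```python
def decoder_attenuation_db(dialnorm_db):
    # Assumed decoder behaviour: attenuation relative to the
    # -31 dBFS reference; dialnorm is negative, e.g. -24 -> 7 dB.
    return 31 + dialnorm_db

def decoded_level_dbfs(input_dbfs, dialnorm_db):
    # Level of the decoded PCM audio after dialnorm attenuation.
    return input_dbfs - decoder_attenuation_db(dialnorm_db)

# A -20 dBFS program encoded with the ATSC-recommended dialnorm of -24:
print(decoded_level_dbfs(-20, -24))  # -27, i.e. 7 dB lower

# 7 dB down in power terms: 10^(-7/10) is roughly 0.2,
# so only about a fifth of the original energy remains.
print(round(10 ** (-7 / 10), 2))  # 0.2
```

The same 7 dB figure falls out for any input level, which is consistent with the "0 VU in, -7 VU out" observation on professional gear below.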
Shaw Media has the same problem with their professional gear: feed 0 VU into a Dolby AC-3 encoder and you only get -7 VU out.