
I suspect the author doesn't know much about digital audio and theatrical audio.

The Dolby Digital signal is AC3, a lossy codec for 5.1 audio. I believe it's the same AC3 that DVDs use. Bitrates (and audio quality) are roughly similar to MP3's. Both were developed around the same time.

> I suspect audio data is stored uncompressed.

A little bit of math: 96 squares (data blocks) a second, at an (assumed) 48,000 samples/second rate, comes out to 500 samples/block. If it's 5778 bits a block (and that's a strong if), that's ~1000 bits/block/channel across the 6 channels of 5.1, or ~2 bits a sample. There's definitely a LOT of compression going on.
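
To make that arithmetic concrete, a quick sketch in Python (the 48 kHz rate, the 5778-bit block, and 6 coded channels are the same guesses as above, not confirmed spec values):

    # Back-of-the-envelope: bits per sample per channel on the film.
    FRAMES_PER_SEC = 24        # film frame rate
    BLOCKS_PER_FRAME = 4       # one data block per perforation -> 96/sec
    SAMPLE_RATE = 48_000       # assumed sample rate, Hz
    BITS_PER_BLOCK = 5778      # the "strong if" from above
    CHANNELS = 6               # 5.1 = 6 discrete channels

    blocks_per_sec = FRAMES_PER_SEC * BLOCKS_PER_FRAME    # 96
    samples_per_block = SAMPLE_RATE / blocks_per_sec      # 500.0
    bits_per_sample = BITS_PER_BLOCK / CHANNELS / samples_per_block

    print(f"{bits_per_sample:.2f} bits/sample/channel")   # ~1.93
    # 16-bit PCM needs 16 bits/sample, so that's roughly 8:1
    # compression; the data can't be stored uncompressed.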

> I couldn’t find any public references to the dolby matrix encoding format

The author is confusing terms here. "The Dolby Matrix Encoding Format" is what's going on in the analog audio tracks. In audio, "matrix encoding" is how multi-channel audio is folded into (and later derived back from) two-channel audio. The analog audio is "Dolby Stereo," where a 5.1 mix is "matrixed" into a 2-channel recording, and then "de-matrixed" on playback. (In theatrical audio, "stereo" means what home audio calls "surround" or "multichannel.")
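
For the curious, here's a toy sketch of the 4:2:4 matrix idea, using the standard -3 dB coefficients. It captures only the flavor of the scheme: the real Dolby Stereo encoder also band-limits the surround and phase-shifts it +/-90 degrees between the two channels, and a real decoder adds active steering to suppress the crosstalk this naive version leaves in.

    # Toy 4:2:4 matrix, per sample. G = -3 dB.
    G = 2 ** -0.5   # ~0.7071

    def encode(L, C, R, S):
        Lt = L + G * C + G * S   # real encoder: surround shifted +90 deg
        Rt = R + G * C - G * S   # real encoder: surround shifted -90 deg
        return Lt, Rt

    def decode(Lt, Rt):
        # Passive decode: fronts pass through, center is the in-phase
        # sum, surround the out-of-phase difference. Crosstalk included.
        return Lt, G * (Lt + Rt), Rt, G * (Lt - Rt)   # L, C, R, S

    # Pure center input (0, 1, 0, 0) decodes to (0.707, 1.0, 0.707, 0.0):
    # the center is recovered, but leaks at -3 dB into L and R.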

From https://en.wikipedia.org/wiki/Dolby_Digital#Cinema : "A constant bit rate of 320 kbit/s is used."



Are you sure Dolby Digital is that low quality? From what I understood, it was lossy compression but mainly limited by the bandwidth of things like optical cables, and could reach some pretty decent levels. Just because it runs at a similar bitrate to MP3 doesn't mean the quality is the same.

edit: This explains the differences pretty well - Dolby Digital (original version) can go higher than 320 kbit/s

https://ottverse.com/mp3-aac-ac3-wav-wma-opus-audio-codecs-d...


In my experience, "low quality" MP3s had little to do with limitations of the MP3 codec. They came about because whoever encoded them chose low bitrates, used a bad encoder, or picked bad settings. And typically whoever was encoding an MP3 wasn't a trained professional.

In contrast, AC3 encoding was performed by professionals who understood what they were doing.


How does professionalism help if they don't have any choice in parameters? AC3 streams on DVD are constant bitrate and there's only one encoder, as far as I know. It's technically fairly similar to MP3, and mostly gets away with that by being used at higher bitrates.


Stereo AC3 encodes for DVDs were commonly 128 kbps, with some in the 224-256 kbps range; 5.1 surround was typically 320-384 kbps. IIRC, AC3 could go to 512 kbps, but that rarely happened because it took bandwidth away from the video. All of the video/audio/subtitle streams in a DVD VOB shared a total bandwidth of 9.8 Mbps. If you had multiple 5.1 streams @ 320 kbps and a default stereo at 224 kbps, you'd already robbed just under 1 Mbps. Subtitle streams were small, but added up as well when supporting multiple languages. The desired average bitrate for video was 8.5 Mbps, but with overhead and all of the audio, it was often much lower. All of that math is just for the demuxer and does not take runtime into consideration.
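
Roughly, that budget works out like this (illustrative numbers matching the comment above, not authoritative spec values):

    # Rough DVD mux budget.
    TOTAL_MUX_KBPS = 9_800     # shared video/audio/subtitle budget

    audio_kbps = [
        320,   # 5.1 AC3, primary language
        320,   # 5.1 AC3, second language
        224,   # default stereo AC3
    ]
    subtitle_kbps = 3 * 10     # subpicture streams: tiny, but they add up

    audio_total = sum(audio_kbps)    # 864 kbps -- "just under 1 Mbps"
    video_ceiling = TOTAL_MUX_KBPS - audio_total - subtitle_kbps

    print(f"audio {audio_total} kbps, video ceiling {video_ceiling} kbps")
    # ~8.9 Mbps left for video, before mux overhead and disc-capacity
    # constraints pull the real average lower still.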


I am mainly thinking about this for modern sources - I haven't run a DVD player in years and years. But I do have a Roku with a Dolby Digital hardware encoder built in, which encodes anything 5.1 from Netflix/HBO Max/Amazon etc. and sends it over TOSLINK to my somewhat older receiver. It sounds pretty damn good, so I'm hoping it defaults to the highest bitrate it can, but who really knows. I also have my Chromecast Audio on TOSLINK, which I assume is doing stereo AC3.


TOSLINK is just an optical connector system; it's not an encoding format. Just like "RCA" is a type of connector: you can run whatever signal you want over it.

Optical audio in consumer equipment is very limited. It seems like it should be high-bandwidth because it's optical, but it falls short. In consumer audio, coaxial S/PDIF over ordinary RCA cables and TOSLINK optical cables do the same job, and the optical cables don't give you better bandwidth or longer cable runs.

You get two channels of PCM audio, or more channels of lossy audio. If you're running stereo or 2.1, I think it makes sense to configure your Roku to send a stereo S/PDIF signal over TOSLINK.
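
The arithmetic behind that limit, assuming the usual 48 kHz / 24-bit figures:

    # An S/PDIF frame carries exactly two 32-bit subframes per sample
    # period, with up to 24 audio bits each -- i.e. two PCM channels.
    SAMPLE_RATE = 48_000
    PAYLOAD_BITS = 24

    pcm_capacity = 2 * PAYLOAD_BITS * SAMPLE_RATE   # 2.304 Mbps of payload
    pcm_5_1 = 6 * PAYLOAD_BITS * SAMPLE_RATE        # 6.912 Mbps -- no fit
    ac3_5_1 = 448_000                               # a typical 5.1 AC3 rate

    print(f"payload capacity: {pcm_capacity / 1e6:.3f} Mbps")
    print(f"5.1 PCM needs:    {pcm_5_1 / 1e6:.3f} Mbps")
    print(f"5.1 AC3 needs:    {ac3_5_1 / 1e6:.3f} Mbps")
    # Compressed 5.1 fits easily inside the two-channel payload (IEC 61937
    # wraps AC3/DTS frames so they look like stereo PCM); raw 5.1 PCM
    # can't, hence "two channels of PCM, or more channels of lossy."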


I understand how optical works - it's just that my receiver only has optical or RCA. Early 2000s receiver. Roku only has HDMI out and TOSLINK. Trust me - the best (and I believe ONLY) way to get 5.1 to this thing is TOSLINK. No S/PDIF connections on either device.


"No S/PDIF connections on either device."

That's because S/PDIF isn't a type of connector. It's commonly carried over coax cables (with RCA connectors) or optical cables (with TOSLINK connectors).


I had to switch to a TOSLINK cable for my soundbar. My LG TV and LG soundbar are really meant to connect via ARC over an HDMI cable. However, mine had serious problems with going out of sync and playing distorted audio for 10-15 seconds before correcting itself. It was very, very annoying. I switched to a TOSLINK cable and no longer have that issue.


Yea, that is a somewhat common issue with HDMI audio - if the device passing it through can't keep up, then there can be sync issues. Or the source device can also be the problem (or an app on the source device).

I have found TOSLINK and coax S/PDIF to both be very reliable for this, though (probably since they are such mature technologies at this point).

Make sure your TV is actually outputting the correct format over TOSLINK to your soundbar - they often default to 2.1 instead of 5.1, or to the wrong format (assuming you have a soundbar with more than just 2.1). A lot of TVs also downmix to 2.1 audio if you pass a source through them and then use the TV's TOSLINK output to a receiver (like, say, Apple TV --> TV --> receiver).


> Early 2000s receiver

What brand? It's impressive when they last that long.


I can't remember ever having a receiver fail. Every receiver I've owned still functions. If I visit my dad, he still uses a receiver that he got in the 1980s.

Older receivers (1980s) can blow an output transistor, which is a pain (fixable if you are skilled). Newer receivers are often built with integrated power amps that have extensive protection circuitry... they're truly impressive, and hard to accidentally damage.

I've done some amplifier repair work. The amplifiers I repaired were typically much older (1950s, 1960s). The most common failure modes are fuses blowing, potentiometers failing, capacitors failing, and PCB-mounted jacks failing.


I had an Onkyo fail a little more than a year ago. It was probably ~7 years old.

I suspect that it didn't have enough airflow, though.

Anyway, I was ready for eARC, so I just replaced it with a Sony. I'm extremely happy with the new one.


I still have a 1970s head unit that sounds just as good today as when my dad passed it down to me as a teenager in the '90s. The thing I loved most about it is that it had A, B, and A+B speaker modes, so it could drive 4 speakers. It wasn't quad, but I could put two sets of stereo speakers in different rooms.


Is that a Bang & Olufsen? They commonly had a similar system of main and "extra" speakers.


Yamaha


First, I wouldn't call MP3 "low-quality". There are some limitations to MP3, and it has lower fidelity than other codecs at the same bit rate, but MP3s are crystal clear when done right.

AC3 has higher fidelity at the same bit rate, and can be used at higher bit rates, but the difference between AC3 and MP3 is not exactly night and day. AC3 is roughly similar to MP3 in terms of quality and bitrate, but that's not a bad thing.

As mentioned elsewhere in the thread, the audio format in cinema wasn't technically AC3 but something quite similar.


MP3 has fatal flaws on some audio and can't compress it transparently at any bitrate - this notably happens with cymbals. Some of this is because there's a maximum bitrate (320 kbit/s) above which a stream is technically out of spec.
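
Rough numbers behind the cymbal problem (standard MP3 block sizes; the masking window is approximate):

    # Quantization noise is shaped per transform block, so on a sharp
    # attack it smears across the whole block -- including the part
    # *before* the hit ("pre-echo").
    SAMPLE_RATE = 44_100
    LONG_BLOCK = 576     # samples per granule (long blocks)
    SHORT_BLOCK = 192    # samples per short block (a granule splits in 3)

    print(f"long block:  {1000 * LONG_BLOCK / SAMPLE_RATE:.1f} ms")   # ~13.1
    print(f"short block: {1000 * SHORT_BLOCK / SAMPLE_RATE:.1f} ms")  # ~4.4
    # The ear only pre-masks noise for a few milliseconds before a
    # transient, so even short blocks leave little margin -- and raising
    # the bitrate doesn't change the block size.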

There is a reason we don't use it anymore. AAC/Opus don't have these problems.


Valid reasons... but definitely not "low-quality".

When we're talking about "transparency" here, we're talking about finding a small percentage of samples that can be ABX'd by people who are specifically listening for artifacts. For plenty of source material, you get transparency with MP3.
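
For anyone unfamiliar with how ABX results are scored, a minimal sketch of the statistics (exact binomial, standard library only):

    from math import comb

    def abx_p_value(correct: int, trials: int) -> float:
        """Chance of >= `correct` right out of `trials` by pure guessing."""
        return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2**trials

    for correct in (10, 11, 12, 13):
        print(f"{correct}/16 correct: p = {abx_p_value(correct, 16):.3f}")
    # 0.227, 0.105, 0.038, 0.011 -- 12/16 is the usual bar for a 16-trial
    # ABX. "Transparent" in practice means listeners can't beat chance.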


Yes, for almost all source material this is true. Unfortunately, there are definitely killer samples out there that can be distinguished at 320 kbit/s with the latest LAME, with no effort at all.

Also, rather than increasing complexity, I think the newer, better codecs are actually simpler than MP3, since we learned which parts of it helped and which didn't.


Dolby SR, the 2-channel analog format on the film, isn't 5.1. It's 4.0 with Left, Center, Right front channels and Mono surround.


Dolby Surround, aka Pro Logic, was later enhanced with Pro Logic II, which provided higher bandwidth and matrixed left+right rear channels.


Was that ever released for film use? I don't recall seeing any prints referring to Pro Logic or Pro Logic II during my time as a student film projectionist. If so, it was highly backwards compatible, since our decoder was only SR (analog 4.0 matrix).

The prints would generally come with SR (analog 4.0) and SR-D (bitmaps between the sprockets). Most of the time you'd also get DTS CDs, and about 40-50% of the prints we got (in Stockholm) had SDDS, though I think there were maybe 5 SDDS theatres in Sweden.



