
24/192 audio recording is a waste of time and sounds worse than CD quality

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.79/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
It's not me saying this, but someone who's apparently an expert in this field. Unfortunately, I've never heard one of these high-rate recordings, let alone compared one to the equivalent 16/44.1 version first hand, but I'm not convinced I agree with him.

This is all explained in great detail below and he makes a semi-convincing argument against the 192kHz sampling rate. One point I definitely disagree with him on is his claim that using a longer word length, i.e. 24 bits instead of 16, only increases dynamic range and not the resolution or "fineness" of the audio.

Imagine recording a 1kHz tone at the full dynamic range of both systems, i.e. using all 16 or all 24 bits at full volume. The second one will have 256 times the resolution of the first (16,777,216 levels rather than 65,536, since resolution doubles with every extra bit). Do the same with music, using the full range of bits, and you have a much clearer and more detailed sound.
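A quick sketch of that arithmetic (my own illustration, not from the article), using the rule of thumb that each bit buys roughly 6 dB of dynamic range:

import math

# Quantisation levels and theoretical dynamic range per bit depth.
for bits in (4, 8, 16, 24):
    levels = 2 ** bits
    dynamic_range_db = 20 * math.log10(levels)  # ~6.02 dB per bit
    print(f"{bits:2d}-bit: {levels:>10,} levels, ~{dynamic_range_db:.1f} dB")
# 16-bit gives ~96 dB, 24-bit ~144 dB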

Apparently, the fact that dither is used makes the increased quantization noise of lower bit depth recordings effectively "go away", since it's converted into uncorrelated noise, which we perceive as white noise. Also, with noise shaping, that noise can be pushed into less audible parts of the band to make it less noticeable. Thus, 16 bits is as good as 24 bits according to this argument.

Oh really? So, if you applied dither to a very low resolution 4-bit recording (just 16 amplitude levels) it would sound just as good as the 16-bit recording with 65,536 levels, only with a lot more noise? No, of course it wouldn't. It would sound very rough indeed, coarse and unpleasant, much like those excessively compressed sound recordings one can find on the internet, usually accompanying very poor quality video.
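If you want to try the dither argument for yourself, here's a minimal NumPy sketch (my own, not from the article): quantize a tone with and without TPDF dither, write both out as WAV and listen. The undithered version has distortion correlated with the signal; the dithered one trades it for a steady hiss.

import numpy as np

def quantize(signal, bits, dither=False):
    # Quantize a float signal in [-1, 1] to the given bit depth.
    step = 2.0 / (2 ** bits)  # size of one quantization step
    if dither:
        # TPDF dither: sum of two uniform noises, +/-1 LSB peak,
        # decorrelates the quantization error from the signal.
        signal = signal + step * (np.random.uniform(-0.5, 0.5, signal.shape)
                                  + np.random.uniform(-0.5, 0.5, signal.shape))
    return np.clip(np.round(signal / step) * step, -1.0, 1.0)

fs = 44100
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 1000 * t)       # 1 kHz tone at -6 dBFS
raw_4bit = quantize(tone, 4)                    # gritty, distorted
dithered_4bit = quantize(tone, 4, dither=True)  # noisy but undistorted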

And a big irony of recording at a higher resolution? The "biggest" improvement in sound occurs with the most dynamic-range-compressed, "engineered to sound louder", manufactured-band pop recordings out there. Since the dynamic range is so limited, fewer amplitude bits are used to encode it, so increasing the resolution gives you a more accurate reproduction. However, garbage in fine detail is still garbage, lol.

In the past few weeks, I've had conversations with intelligent, scientifically minded individuals who believe in 24/192 downloads and want to know how anyone could possibly disagree. They asked good questions that deserve detailed answers.

I was also interested in what motivated high-rate digital audio advocacy. Responses indicate that few people understand basic signal theory or the sampling theorem, which is hardly surprising. Misunderstandings of the mathematics, technology, and physiology arose in most of the conversations, often asserted by professionals who otherwise possessed significant audio expertise. Some even argued that the sampling theorem doesn't really explain how digital audio actually works [1].

Misinformation and superstition only serve charlatans. So, let's cover some of the basics of why 24/192 distribution makes no sense before suggesting some improvements that actually do.

Read the rest at http://people.xiph.org/~xiphmont/demo/neil-young.html

This is still an interesting and informative article nonetheless.
 
Very very interesting read, qubit.

You caught me the same time I'm about to install a new car system :D

As a general thing I'm pretty satisfied with 16/44.1 and 24/48, so I don't think it can get much better than that (if it gets better at all)

Gonna read it properly later :)
 
You're welcome radrok.

In fact, now that I think about it, I can disprove the assertion that 24-bit isn't better than 16-bit with my own experience.

My Acorn A3000 from 1989 had 8-bit stereo sound. Playing recordings on it, even 44kHz ones, sounded quite rough and absolutely nowhere near as good as CD. Note that it had a low-pass filter on the analog stage which made the sound very muffled. I removed it, allowing the full high frequency response to be heard, along with the aliasing. In fact, a decent cassette deck with good quality tape sounded much better.

Also, Nero allows a CD to be ripped into different formats. One of those is 8-bit/44.1kHz, and again 8-bit audio sounds noticeably worse than 16-bit. Again, I could hear artifacts, although it sounded better than the A3000. Therefore, by extension, the same thing must be happening when comparing 16-bit and 24-bit recordings. Of course, the difference here is the law of diminishing returns: since CD already sounds so good, it's quite likely very hard to hear the difference, but nevertheless it's very much there.
 
This is BS. If you ever listen to a song from vinyl (those are written in 24 bits and 192kHz) and compare it to the same song from a CD (16 bits, 44.1kHz), you'll notice that the vinyl sounds clearer and generally better.
 
Vinyl is analog, so it doesn't have a sampling rate or bit depth, unlike what you've said.

Also, I've never heard vinyl sound better than CD, regardless of how good the system was. Anyway, this thread isn't a vinyl v CD debate, so let's leave it there.

EDIT

If you want to discuss vinyl v CD with me, why not start a thread on it in this section and we can debate it?
 
Vinyl has no relevance here.

Also, vinyl is a different kind of beast; it has a different "taste", much like how well-aged wine compares to "vino novello" (as we call it in Italy) ;)
 
Blue Man Group - Audio on DVD Audio. 'nuff said.

If the audio was recorded analog (vinyl) or at a resolution greater than 24-bit/192kHz (The Crystal Method said they record at 32-bit), then there is a difference. The question is whether or not your audio system can reproduce that difference and your ears can perceive it. The album above is 6-channel as opposed to stereo and that, by itself, is a night-and-day difference. The difference also comes out on hi-hats through speakers that have high-frequency tweeters.


"Worse than CD quality?" Hell no! Whoever claimed that is so used to the lossy garbage that we've all been exposed to over the last decade and doesn't know what it is like to experience clean audio. Lets compare the raw data:
redbook_vs_dvd_audio.png


Red Book has approximately one-tenth the data of DVD-Audio.
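That roughly-10x figure is easy to check from the raw PCM rates (my own arithmetic, comparing 6-channel 24/96 DVD-Audio against stereo Red Book):

def pcm_bitrate(sample_rate_hz, bits, channels):
    return sample_rate_hz * bits * channels  # bits per second

red_book = pcm_bitrate(44_100, 16, 2)   # 1,411,200 bit/s (~176 kB/s)
dvd_audio = pcm_bitrate(96_000, 24, 6)  # 13,824,000 bit/s (~1.7 MB/s)
print(dvd_audio / red_book)             # ~9.8, i.e. roughly 10x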
 
Blue Man Group - Audio on DVD Audio. 'nuff said.
The article addresses claims like this by saying that the recordings are derived from different masters, with the higher resolution one using better gear to record the original sound, along with better mastering processes. He also cites confirmation bias as being significant here.

He claims that keeping all things equal other than the recording format, you won't hear any difference and backs up his assertion with a study performed on over 500 people which "proved" that one can't hear the difference.

Again, this isn't me saying this, I'm only the messenger, so shoot me. :p I'm also not convinced by his arguments without hearing it for myself and would have loved to have been part of that study.

One way to look at this is to ask yourself: is there any physical difference in the waveform coming from the speakers between the two formats? If there is, even if it's tiny, then it can potentially be heard, at least by some people and with the right (expensive) gear.
 
Of course it's from a different master, because the original masters are predominantly two channels and the example I gave has six. They had to re-record the whole album in a studio with no fewer than five microphones and then further mastered it by isolating channels (especially the subwoofer). It's foolish to pretend recording technology hasn't advanced, because it has. This is why a lot of older artists have done re-recordings for compilation records.

...

Let's look at a different situation: TV. It underwent an HD revolution recently, going from 480i (13.824 MB/s) to 1080p (186.624 MB/s). That's over a 10x jump, just like going from Red Book to DVD-Audio. So why is it that the difference in HD TV is so obvious whereas it isn't in HD audio? Simple: our eyes are far better than our ears. As I alluded to previously, the advantage of DVD-Audio isn't necessarily that you can hear the greater number of samples or the greater bit depth (that answer is going to be different for everyone and their audio systems) but that there are three times as many channels for positional sound. It's like going from a conversation in a sound-deadened room to a conversation in an opera house.


Edit: Two more pictures...

This one compares 1 second of 24-bit @ 96kHz 6-channel versus 1 second of 16-bit @ 44.1kHz 2-channel:
audacity_1sec.png


This is the same thing, but zoomed in to the point where instead of a second we only see 24 hundred-thousandths (0.00024) of a second. See the little | marks on the line? Those are the samples. Notice how the left has more than double the number of samples of the right (about 23 at 96kHz versus about 10.6 at 44.1kHz in that window), never mind the extra channels.
audacity_24_hundred-thousandths_sec.png


Disclaimer: the left originated from lossy AC3 5.1 DVD audio converted into lossless RIFF WAVE. The right originated from an MP3 rip of the CD. They're at a similar position in the song but not exactly the same, because the sources are different.
 
Ford, according to our expert, 192KHz results in poorer sound not because sampling more often directly causes degradation, but because there are more ultrasonics in the output, which tend to cause real-world equipment to produce intermodulation distortion, along with "no benefit" in quality from the higher sampling rate, since the waveform within the human hearing range is 100% recovered with 41KHz sampling, which can be proved mathematically; hence sampling any faster is irrelevant. He says that faster sampling is useful when mastering, to prevent aliasing artifacts from creeping in, but that it should all be downsampled to 41KHz (or 44KHz for DVD) at the end.
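The intermodulation point is easy to demonstrate numerically. Here's a sketch of my own (not from the article), passing two ultrasonic tones through a hypothetical, mildly nonlinear playback chain and watching an audible difference tone appear:

import numpy as np

fs = 192_000
t = np.arange(fs) / fs
# Two ultrasonic tones, inaudible on their own
x = 0.5 * np.sin(2 * np.pi * 30_000 * t) + 0.5 * np.sin(2 * np.pi * 33_000 * t)

# Hypothetical playback chain with a small 2nd-order nonlinearity
y = x + 0.1 * x ** 2

spectrum = np.abs(np.fft.rfft(y)) / len(y)
freqs = np.fft.rfftfreq(len(y), 1 / fs)
audible = (freqs > 20) & (freqs < 20_000)
# The nonlinearity folds 33k - 30k = 3 kHz into the audible band
print(freqs[audible][np.argmax(spectrum[audible])])  # ~3000.0 Hz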

I don't really buy this argument, though I have no experience to back up my position. I just feel that good quality equipment is going to behave properly and not spew out crap, and will deliver a better waveform from the extra samples due to things like jitter. Another reason I think he's wrong is those ultrasonics. When you go to a concert, whether it be classical or rock, you're going to be subjected to ultrasonics from the hi-hats (cymbals). Even though one can't directly hear them, I doubt they have no effect on the music, and I have seen subjective reports which say the music sounds clearer and more sparkly when the ultrasonics are present. Hence, not capturing those signals with a higher sampling rate is actually degrading the sound. Heck, do people produce ultrasonics from sibilants in speech? Especially women. Again, if this isn't captured, then the recording isn't an accurate version of the original and one should be able to hear the difference.

Unfortunately, your example about the extra channels and re-recording the music is comparing apples and pears, so it isn't valid. He isn't disputing that more can be done creatively with a multichannel format.

To fairly compare 16/41 with 24/192, one must compare identical recordings with this as the only difference, and he's right here. Hence, only two channels are used for the comparison, along with the same mastering. His assertion is that one can't hear the difference between them, but I'm somewhat skeptical of that. And again, if the final signal reproduced by the speakers is not the same, then there's a difference all right and it can potentially be heard, even if it's hard to tell them apart.
 
He says that faster sampling is useful when mastering, to prevent aliasing artifacts from creeping in, but that it should all be downsampled to 41KHz (or 44KHz for DVD) at the end.
Red Book (CD audio) is 44.1 kHz. DVD Audio supports 44.1 kHz, 48 kHz, 88.2 kHz, 96 kHz, 176.4 kHz, or 192 kHz at 16-, 20-, or 24-bits per sample.

Unfortunately, your example about the extra channels and re-recording music is comparing apples and pears here so isn't valid. He's not arguing that more can be done creatively with a multichannel format.
My point was that 96 kHz captures about 218% as many samples per second as 44.1 kHz, and 24-bit has 256 times (25,600%) the amplitude precision of 16-bit (far less rounding). No matter how you shake it, it is clearly superior. What the author questions, and rightly so, is whether the inputs (recording equipment) and outputs (playback equipment) are superior, and that's a case-by-case matter that can't be covered by a blanket statement.
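Checking that arithmetic (my own quick calculation):

print(96_000 / 44_100)    # ~2.18: about 218% as many samples per second
print(2 ** 24 / 2 ** 16)  # 256.0: 25,600% the amplitude resolution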

To fairly compare 16/41 with 24/192, one must compare identical recordings with this as the only difference and he's right here.
There is only one way to make a fair comparison between two audio formats and that is by way of digitally constructed sounds. Even so, there is bias, because either quality was lost to rounding going from 24-bit to 16-bit, or the quality never existed going from 16-bit to 24-bit (the same applies to samples). "Apples to apples" literally does not exist, because we're ultimately dealing with analog inputs (instruments) and analog outputs (speakers).

The only truly fair comparison I can think of is comparing 16-bit 44.1 kHz, 88.2 kHz, and 176.4 kHz, starting with the 176.4 kHz stock and playing either every second or every fourth sample to simulate the lower sample rates. It would have to be mastered in a state-of-the-art recording studio and played back to a test audience on studio-quality speakers.

Edit: 176.4 kHz test tones are available here, but I don't know how they were produced:
http://www.audiocheck.net/testtones_highdefinitionaudio.php
 
24/192 audio recording is a waste of time and sounds worse than CD quality
I hope they made sure all the hardware and drivers used supported this 100% before coming to this conclusion.

I am sure this would require high-end hardware at minimum, from mic to speakers.

EDIT:
Just my stupid opinion.
 
Oh duh! My "41KHz" there. I meant 44.1KHz for CD; sorry, frazzled brain. :laugh:

I agree about the blanket statement. While the maths may show that a perfect waveform can be recovered, we know that real-world equipment will suffer from things like jitter, which has the most effect on the steepest part of the waveform's slope, i.e. the zero-crossing point. Sampling faster helps to alleviate this, and likely other things I can't even think of here.

Have a read of his description and the summary he linked to of the listening study made with those 500+ people (the full paper is behind a paywall, unfortunately). They constructed the CD rate on the fly from the 24/192 signal, and as you've noticed, 44.1 doesn't divide evenly into 192, thereby introducing resampling artifacts that wouldn't be present in a true 44.1 recording. Even so, apparently no one could tell the difference, which I find hard to believe. Also, I'm not sure that simply throwing away samples is a valid way to recreate a lower sample rate from a higher one to prove his argument.

Think about how pictures look when this is done: horribly scrunched. No, the picture has to be interpolated down, and there are many interpolation algorithms out there. What this tells you is that, for a true comparison, the analog signal should be sampled by two sets of ADCs, one at CD rate and the other at the higher rate. Of course, any on-the-fly conversion artifacts would make a difference more likely to be audible than a proper two-ADC comparison, so the null result actually strengthens his argument!
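For what it's worth, proper sample rate conversion doesn't just drop samples; it low-pass filters first, exactly like good image downscaling. A SciPy sketch of my own (not necessarily how the study did it), using the exact rational ratio 44100/192000 = 147/640:

import numpy as np
from scipy.signal import resample_poly

fs_hi = 192_000
t = np.arange(fs_hi) / fs_hi
x_hi = np.sin(2 * np.pi * 1000 * t)  # 1 kHz tone sampled at 192 kHz

# Naive decimation keeps every Nth sample and aliases anything above
# the new Nyquist frequency; resample_poly filters before decimating.
x_naive = x_hi[::4]                           # 192k -> 48k, no filtering
x_cd = resample_poly(x_hi, up=147, down=640)  # 192k -> 44.1k, anti-aliased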

Thanks, I'll check out those 176.4kHz test tones.
 
I'll check out those 176.4kHz test tones.
Just make sure the software you use for playback doesn't downsample the audio before playing it. This is very common (even if your hardware supports the rate), especially with Windows programs...

Just my stupid opinion, again.

EDIT:
And don't use EQ or apply any audio filter. My stupid opinion again.
 
I resampled 176.4 kHz "pink noise" to 88.2 kHz and 44.1 kHz but I couldn't hear any difference because it's just noise. I put them all together and well, a picture is worth a thousand words:
all_together_now.png


I did try comparing 32-bit float to 24-bit PCM and 16-bit PCM but Audacity didn't show the difference because it likely converts everything to 32-bit float.

Edit: I doubt any speaker can move fast enough to reproduce that resolution, and I also doubt the human eardrum can move that fast. Even so, the data is clearly there at 176.4 kHz and only vaguely represented at 44.1 kHz.

Edit: ...maybe planar magnetic drivers can move that fast...
 
Blue Man Group - Audio on DVD Audio. 'nuff said.


amen to that..
 
The Norwegian label 2L sells 24-bit/352.8kHz DXD. DSD64 and DSD128 are a growing trend in D/A converters these days.
 
I must say wow, currently streaming the 24/192 to my Bit One and it's marvelous o.O

Poco Adagio right now.
 
The Nyquist theorem, which is the math behind all digital audio, states that you can capture any arbitrary band-limited waveform with 100% confidence up to half the sampling rate. This means that any waveform content in the 0-22 kHz range is fully captured with a 44.1 kHz sampling rate.

As the article explains, most vinyl nuts describe digital audio as a stepper diagram, or claim that anything between two samples is a straight-line interpolation between two points, which is anything but the truth.
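A sketch of that point (my own, not from the article): ideal reconstruction draws a sum of sinc functions through the samples, not steps or straight lines, and it lands back on the original band-limited waveform:

import numpy as np

fs = 44_100
n = np.arange(64)
x = np.sin(2 * np.pi * 10_000 * n / fs)  # 10 kHz tone, ~4.4 samples per cycle

# Whittaker-Shannon reconstruction at 8x finer time resolution
t_fine = np.arange(64 * 8) / (fs * 8)
recon = sum(x[k] * np.sinc(fs * t_fine - k) for k in range(len(x)))

truth = np.sin(2 * np.pi * 10_000 * t_fine)
mid = slice(16 * 8, 48 * 8)  # compare away from the window edges
print(np.max(np.abs(recon[mid] - truth[mid])))  # small; only the truncation
# of the sinc sum (finite window) keeps it from being exactly zero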

The fact is that the math says there is no difference, the Boston Audio Society couldn't hear a difference, I can't hear a difference, and my blind tests with friends revealed that the only difference is the master.

Vinyl is just a curious way of introducing artifacts to the recording and coloring the sound due to the inherent issues with the technology.

Monty at Xiph absolutely nails it with the article which is easy to understand and provides a wealth of good information and sources.

24-bit recordings had some merit back when the Windows volume control was crap, in Windows XP. Now it's a 32-bit floating-point design, which means you can use the volume control without it hurting the sound quality.
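A sketch of why that matters (my own illustration, assuming a volume control that works in the integer domain): attenuating in 16-bit integer throws the low-order bits away for good, while attenuating in float keeps them.

import numpy as np

x = (np.random.uniform(-1, 1, 44_100) * 32767).astype(np.int16)  # 16-bit audio

# Integer-domain volume at 25%: low bits are truncated and lost
quiet_int = (x // 4).astype(np.int16)
restored_int = quiet_int * 4                # up to 3 LSBs of error per sample

# Float-domain volume at 25%: fully reversible
restored_float = ((x.astype(np.float32) * 0.25) * 4.0).astype(np.int16)

print(np.max(np.abs(restored_int - x)))    # 3: bits permanently lost
print(np.max(np.abs(restored_float - x)))  # 0: no loss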
 
I'm downloading Mozart's Violin Concerto in 5.1 FLAC.
Boo! They overdrove the mic in a few places; it sputters in the recording. It could also be a down-conversion artifact. :(
 