
An "Audiophile Grade" SSD—Yes, You Heard That Right

Joined
Jan 28, 2021
Messages
845 (0.71/day)
Except that you can think of the previous (async) step always as the storage media. The DAC doesn’t know when a certain bit was read from the storage media to cpu cache, to ram, to cache, to pcie, to usb controller, to usb receiver buffer, to i2s connecting the usb receiver to the actual DAC chip. Only the last step is actually timed.
Errors can still happen in that last step of the stream being read by the receiving chip in the DAC and while being processed.
Nope. There is just a single clock on the DAC, which is split up and combined to create any other (minor)clocks that are necessary.
Right, they communicate via a single clock, but the stream consists of several clocks, any of which are subject to error.
For example the ARES II has a 10+ms buffer. Many other ”high-end” DACs have similar stuff.
Cool, so it is a thing, I'll have to read up on that.
Temperature-controlled main clock source makes sense and can actually affect how things sound. I2S is used in even most sub-$50 DACs, just internally. It’s just a basic board-level interconnect, nothing crazy. What’s crazy is trying to use it for something it was never designed for, and is not good for, like connecting a PC to a DAC.
Yeah, I know the origins of I2S. Using it as an external interface seems crazy, I suppose, if you think USB async is without fault, and that kinda seems like the majority of our disagreement here. The interface itself is more robust, and putting each clock along with the data on its own path would have tangible benefits in my opinion.
The size of a buffer makes no difference in the amount of work needed to ’keep track of and manage it’. You can utilize the same exact data handling code for buffers of almost any size, by just changing one input parameter.
That doesn't seem right to me. If you increase the size of a CPU's cache, latency goes up. If a DAC has to buffer more frames of the PCM stream, and keep track of them for the event in which it needs to use what's in the buffer rather than what was next in the stream, how is that not more work for it to manage and time those additional frames in the buffer? I mean, this is happening 44,100 times a second in the case of plebeian CD-quality audio.
It is the same bits, they are just repackaged to differing lengths depending on the transfer interface. For example on USB the DAC just sends a request for the ’next n bytes of data’ and the cpu then fetches them from RAM or HDD, packages them into the USB packet and sends it off.
If the bits would be different, it would sound different.
Well, the bits being sent are the same, but how they get there is what's in question. We already went over how a real-time digital audio stream is different from, say, transferring a file to a USB flash drive, which is honestly beyond most people's awareness of how this works, so no need to go over that again.

I get your points, but something as fundamental as the async feature of USB Audio 2 is essentially a technique that was added to USB audio to compensate for the problems encountered in a real-time digital stream. If digital streams didn't have these problems, async DACs wouldn't be needed. In USB Audio 1 (non-async DACs) the bits being sent would be the same, but the interface is at fault, so the sound would be different. So either USB async DACs completely solve everything and things like I2S are a waste of time, or it's just another step down the path of mitigating the issues with digital streams.
 
Joined
Oct 15, 2019
Messages
549 (0.33/day)
Right, they communicate via a single clock, but the stream consists of several clocks, any of which are subject to error.
What different clocks? Any data transfer related clock accuracy is completely irrelevant in async transfer.

Errors can still happen in that last step of the stream being read by the receiving chip in the DAC and while being processed.
Yes! The same goes for i2s, and literally any transfer protocol.

if you think USB async is without fault and that kinda seems like the majority of our disagreement here.
For non-realtime applications, I really don’t see any major faults with it. If I had the ability to change something, I’d maybe add some error correcting bits in the packet structure, just so that we wouldn’t need to have this discussion about how likely transfer errors are over USB.

The interface itself is more robust and putting each clock along with the data on its own path would have tangible benefits in my opinion.
What are the benefits, in your opinion?
Comparing the two options of: ’async source -> DAC’ and ’async source -> i2s transmitter -> DAC’.
I really don’t see any, but maybe I have missed something.

That doesn't seem right to me. If you increase the size of a CPU's cache, latency goes up.
But the cache latencies are not a problem at all in this use case. They just add additional consistent latency to the overall signal chain. You can get constant-latency memory chips, clocked to the DAC's master clock.

If a DAC has to buffer more frames of the PCM stream, and keep track of them for the event in which it needs to use what's in the buffer rather than what was next in the stream, how is that not more work for it to manage and keep track of the timing of these additional frames in the buffer?
The frames can be connected together, and reading can happen sequentially based on memory address. I guess programming isn’t your strong suit, if you think that the size of a data structure makes it more complex in itself.
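The "one input parameter" point can be sketched in a few lines. This is a toy illustration, not any real DAC firmware; `SampleFifo` is a made-up name, and overflow here just drops the oldest sample:

```python
from collections import deque

class SampleFifo:
    """A FIFO of PCM samples; capacity is just a constructor parameter."""
    def __init__(self, capacity):
        # Identical code path for any size -- only this number changes.
        self.buf = deque(maxlen=capacity)

    def push(self, sample):
        self.buf.append(sample)  # oldest sample dropped if full (toy choice)

    def pop(self):
        # Reader drains sequentially; no per-frame bookkeeping needed.
        return self.buf.popleft() if self.buf else 0  # underrun -> silence

small = SampleFifo(capacity=64)
large = SampleFifo(capacity=44_100)  # ~1 s at CD rate, same exact logic
for s in range(10):
    small.push(s)
    large.push(s)
print(small.pop(), large.pop())  # both pop the first sample pushed: 0 0
```

The buffer depth changes how much latency the FIFO adds, but not the complexity of the code managing it.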

I get your points but something as fundamental as the async feature of USB Audio 2 it is essentially a technique that was added to USB audio to compensate for the problems encountered in a real time digital stream.
It was added so that the computer's inaccurate clocks would be disconnected from the audio output. In USB audio 1 that was a real problem, and it was solved via data science in USB audio 2. The only downside was the need to increase the minimum buffer size of the DAC, making them less real-time. For most use that does not matter.
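The decoupling described above can be sketched as a toy feedback loop: the DAC's own clock drains the FIFO at a fixed rate, and the host nudges its packet size based on the reported fill level, so the host's clock never times playback. All numbers and names here are illustrative, not taken from the USB Audio spec:

```python
NOMINAL = 44  # nominal samples per 1 ms USB frame at 44.1 kHz (44 or 45)

def host_packet_size(fifo_fill, target_fill):
    """Host-side rate adjustment driven by the DAC's feedback value (toy)."""
    if fifo_fill < target_fill:
        return NOMINAL + 1  # buffer draining too fast -> send a bit more
    if fifo_fill > target_fill:
        return NOMINAL - 1  # buffer filling up -> send a bit less
    return NOMINAL

fill = 400      # samples currently buffered in the DAC
target = 441    # e.g. ~10 ms of headroom at 44.1 kHz
for _ in range(5):
    fill += host_packet_size(fill, target)  # host pushes one packet
    fill -= 44                              # DAC clock drains ~44 samples/ms
print(fill)  # creeps toward the target instead of drifting: 405
```

The DAC's clock sets the drain rate; the host only keeps the buffer near its target, which is exactly why its own clock accuracy stops mattering.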

So either USB async DACs completely solve everything and things like I2S are waste of time or its just further down the path to mitigate the issues with digital streams.
It solved everything for most users. Real time users have moved to thunderbolt. I2s as an external interface is just some audiophile marketing bullshit and ”solves” nothing.
 
Joined
Jan 28, 2021
Messages
845 (0.71/day)
What different clocks? Any data transfer related clock accuracy is completely irrelevant in async transfer.
All of the clocks that are present in I2S internally (the master clock, bit clock, word clock, maybe more) are part of PCM and have to be muxed and demuxed when they are transferred over USB, because there is only one data line. Async helps keep everything in sync, but it's only mitigating the issue; it can't be perfect, therefore not irrelevant.
Yes! The same goes for i2s, and literally any transfer protocol.
What are the benefits, in your opinion?
Comparing the two options of: ’async source -> DAC’ and ’async source -> i2s transmitter -> DAC’.
I really don’t see any, but maybe I have missed something.
Of course nothing is ever perfect, but giving all the separate data lines and clocks dedicated paths means more bandwidth and less interference for those specific components of the stream.
But the cache latencies are not a problem at all in this use case. They just add additional consistent latency to the overall signal chain. You can get constant-latency memory chips, clocked to the DAC's master clock.
Given how crucial clock accuracy is to the internal operations of a DAC (how fast data is coming in, the sampling rates involved), it would seem like there would have to be some performance cost to adding more cache. If you are holding data in a cache to compare it to incoming data, that's more work. I still have to read up on what the Ares II DAC does in this regard.
The frames can be connected together, and reading can happen sequentally based on memory address. I guess programming isn’t your strong suite, if you think that the size of a data structure makes it more complex in itself.
I'm not a programmer by any means. For reference, I work on the infrastructure side of datacenter IT, specifically configuring high-performance hierarchical database systems, and a bit on the SAN side of said systems. The key difference with the type of data I'm working with, or really any traditional data handling, is that error checking is always present before any data is considered good (this is covering old ground). When data is sent over an HBA via 50' of plastic fiber cable in one of my servers, there are no doubt errors happening all of the time, but the database doesn't get corrupted, because the system takes the time to compare the data to what it expects. The principles of binary data and how it's transmitted are no doubt largely the same, but the error handling is not the same in a real-time digital audio stream.
It solved everything for most users. Real time users have moved to thunderbolt. I2s as an external interface is just some audiophile marketing bullshit and ”solves” nothing.
There honestly isn't any way to determine what it solves or doesn't from forum posts, even with a strong knowledge of the fundamentals of transferring digital data: inferring that old interfaces (S/PDIF, USB Audio 1) have weaknesses vs. new and improved interfaces (USB Audio 2) and concluding that it's completely solved.

If it were just marketing bullshit, it's years and years of work to develop, test, and manufacture it. Whatever you may think of its technological merits or goals, developing any new product from scratch is hard, let alone one that uses interfaces and protocols in a new and largely proprietary way. There are far, far easier ways to market bullshit than to go through all that effort, especially in the audio world.
 
Joined
Oct 15, 2019
Messages
549 (0.33/day)
Of course nothing is ever perfect but giving all the separate data lines and clocks dedicated is more bandwidth and less interference for those specific components of the stream.
In the given example, could you point out how and where the different interference aspects happen? As you know, USB is more robust when it comes to interference than i2s (due to differential data lines).

All of the clocks that are present in I2S internally (the master clock, bit clock, word clock, maybe more) are part of PCM and have to be muxed and demuxed when they are transferred over USB because there is only one data line.
Umm, it’s just one clock. The word clock is just the data clock divided by 8(?), and so on. Only the single master clock is a hardware-level thing; the rest is done with software and some transistors.
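The derived-clock relationship can be illustrated with typical figures for 16-bit / 44.1 kHz stereo (the 256x master-clock multiplier is a common choice, assumed here for the example):

```python
# All I2S clocks are typically integer divisions of one master oscillator.
fs = 44_100          # sample rate = word clock (LRCLK)
mclk = 256 * fs      # master clock, a common multiplier: 11,289,600 Hz
bclk = fs * 2 * 16   # bit clock: 2 channels x 16 bits per sample = 1,411,200 Hz

print(mclk // bclk, bclk // fs)  # 8 32 -- one oscillator, simple dividers
```

So there is one physical clock source; the bit clock and word clock fall out of it by division, which is the sense in which "it's just one clock."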

it would seem like there would have to be some performance cost to adding more cache. If you are holding data in a cache to compare it to data coming in, that's more work.
Just no. You don’t need to compare data in cache to data you are receiving. It’s just a simple FIFO data buffer.

If it were just marketing bullshit its years and years of work to develop, test, manufacture it.
Just waiting for someone to publish the tests that prove it does something. Any second now.
If no definitive benefits have been proven to exist, then why the hell has money been pumped into this?
There honestly isn't any way to determine what it solves or doesn't from forum posts, even with a strong knowledge of the fundamentals of transferring digital data: inferring that old interfaces (S/PDIF, USB Audio 1) have weaknesses vs. new and improved interfaces (USB Audio 2) and concluding that it's completely solved.
Yup. And until someone determines with credible methods that something exists, then it probably doesn’t. Or at least we should not base things on the premise that it does.

edit: but yeah, we have very fundamental differences in the way we think about stuff. You seem to think that if something isn’t proven to not exist, that it might exist. I think that if something is thought to exist, it needs to be able to be proven.

That’s all. Proving negatives doesn’t really work for anyone, so it’s maybe best to tap out.
 
Joined
Jan 28, 2021
Messages
845 (0.71/day)
In the given example, could you point out how and where the different interference aspects happen? As you know, USB is more robust when it comes to interference than i2s (due to differential data lines).
It's ultimately a more robust cable and connector. 1's and 0's converted to voltage and back to 1's and 0's, right? And there is no reason to go over again how a digital audio stream is different from how traditional data is transferred.
Umm, it’s just one clock. The word clock is just the data clock divided by 8(?) and so on. only the single master clock is a hardware level thing, the rest is done with software and some transistors.
It's really hard to find information on how audio gets encoded and decoded, but it's my understanding that there are several clocks used in PCM, and I assume other formats all work in a similar way. The master clock is what is used to send/receive from host to device, but that's irrelevant to how the audio is packaged.

Just no. You don’t need to compare data in cache to data you are receiveing. It’s just a simple FIFO data buffer.
Yeah, ok, so that's kinda what I thought anyway.

I couldn't find any information about the Ares II (or any other DAC) having anything particularly special going on with its buffer.
Just waiting for someone to publish the tests that prove it does something. Any second now.
If no definitive benefits have been proven to exist, then why the hell has money been pumped into this?
Yeah, I mean I'm with you, but who would certify such a test, and who's the target audience? (Digital) audio enthusiasts that have a halfway decent understanding of how data transfer works and of the differences in a digital real-time stream (I'm pretty much at my limits in being able to talk about this from a technical perspective)? It's just such a minuscule number of people that would care or know what is being presented to them that I don't think it would make any difference one way or another. Most people are in one of two camps: it's "just 1's and 0's, all DACs sound the same," or they're only interested in subjective interpretations in the form of audiophile vernacular.
Yup. And until someone determines with credible methods that something exists, then it probably doesn’t. Or at least we should not base things on the premise that it does.

edit: but yeah, we have very fundamental differences in the way we think about stuff. You seem to think that if something isn’t proven to not exist, that it might exist. I think that if something is thought to exist, it needs to be able to be proven.

That’s all. Proving negatives doesn’t really work for anyone, so it’s maybe best to tap out.
Yeah, I think we maxed it out, and I don't think anyone else is following along anymore. I'm not trying to be right here just learn.

I gave up on high-end audio like 8-10 years ago, when it seemed like the consensus was that a $150 DAC could be "bit perfect" if it was configured correctly, and it's all 1's and 0's anyway, so anything beyond that point is literally doing nothing. That notion never made sense to me, and it was insanely frustrating knowing what a $150 speaker sounds like vs. what a $1,500 speaker sounds like, yet all DACs are the same? It didn't really matter then, as I didn't have the resources to buy much more than a $150 DAC anyway, so it was kinda moot and there was no reason to dwell on it.

That's just a bit of context into the motivation behind my thought process. As to what exists or doesn't, ideally yeah, prove it, but most people either care about "how it sounds" or belong to the much smaller subset that just wants "proof." It seems like most people that claim to be looking for proof are just looking for any kind of data to prove their point that there is no difference, and probably wouldn't be moved regardless of what was presented to them. I don't think there are enough people that are both genuinely interested and know enough to have a conversation like we are here to really move the needle either way. That's my cynical impression of the state of the issue, though, I guess.
 
Joined
Oct 15, 2019
Messages
549 (0.33/day)
Its ultimately a more robust cable and connector. 1's and 0's converted to voltage and back to 1's and 0's right?
It has no dedicated connector. Most use just a repurposed hdmi cable and connectors. Nothing amazing, or high end there. Specs for hdmi cables are not meaningfully different to usb. Electrically usb is superior to i2s, when it comes to interference handling.

Its really hard to find information on how audio gets encoded and decoded but its my understanding that there are several clocks used in PCM and I assume other formats all work in a similar way. The master clock is what is used to send / receive from host to device but thats irrelevant to how the audio is packaged.
Your knowledge is lacking, sadly. There is just one clock in pcm that is important, which is the sample rate. The others are either derived from it, or are arbitrary to the output.

I couldn't find any information about the Ares II (or any other DAC) having anything particularly special going on with its buffer.
It’s just bigger, that’s it. You can determine the buffer size by measuring output latency.
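That latency-to-depth relationship is one multiplication; the figures below are illustrative, not measurements of the Ares II:

```python
def buffered_samples(latency_s, sample_rate):
    """Estimate FIFO depth (samples per channel) from measured output latency."""
    return round(latency_s * sample_rate)

# e.g. ~10 ms of extra output latency at 44.1 kHz implies roughly:
print(buffered_samples(0.010, 44_100))  # 441 samples per channel
```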

Yeah, I mean I'm with you but who would certify such a test and who's the target audience.
There are plenty of technical journals that publish data science stuff. If that's too high a bar to clear (lol), then just publish it on your homepage for others to see. No tests means no improvement.

It seems like most people that claim to be looking for proof are just looking for any kind of data to prove their point that there is no difference and probably wouldn't be moved regardless of what was presented to them.
Nice assumptions there. I, for one, would love for there to be new and improved ways for DA converters to work, and for Nyquist's theorem to be proven false. It would be extremely interesting for many fields outside of digital audio.

consensus was a $150 DAC could be "bit perfect" if it was configured correctly and it's all 1's and 0's anyway, so anything beyond that point is literally doing nothing. That notion never made sense to me but was insanely frustrating, knowing what a $150 speaker sounds like vs. what a $1,500 speaker sounds like, yet all DACs are the same?
What makes sense is being able to prove that something exists or not. Non-proven bullshit is what doesn’t make sense.
 
Joined
Jan 28, 2021
Messages
845 (0.71/day)
It has no dedicated connector. Most use just a repurposed hdmi cable and connectors. Nothing amazing, or high end there. Specs for hdmi cables are not meaningfully different to usb. Electrically usb is superior to i2s, when it comes to interference handling.
I know that it's simply a repurposed HDMI cable and connector. Whatever the spec may be, high-quality, well-built HDMI cables are plentiful and are better than USB 2.0 cables, which is what USB audio uses; how it compares to USB 3.0+ I don't know. As to I2S, all DACs internally work with I2S, so it's specifically the separate clocks used in an external I2S interface that get dedicated conductors, whereas with USB or S/PDIF it all gets muxed together.
Your knowledge is lacking, sadly. There is just one clock in pcm that is important, which is the sample rate. The others are either derived from it, or are arbitrary to the output.
I know my knowledge is lacking, I said as much; I don't have a degree in any of this. Having a strong knowledge of how digital data is transferred in the traditional sense, and of how PCM works, doesn't make you an authority on this subject either, however. PCM is not the same thing as I2S, but they were developed around the same time as part of the development of CD audio (I think?... before my time).
There are plenty of techical journals that publish data science stuff. If that’s a too high bar to clear (lol), then just publish it at your homepage for others to see. No tests means no improvement.
I'd like to see that too, but the cross-section between high-end audio and outlets that publish in-depth technical studies and conduct proper blind tests is small; the target audience would be even smaller. 99% of the people buying this gear don't have the slightest idea of how any of it works, and don't care. I waste a lot of time on the internet (obvious at this point), and I can't think of anyone that would conduct and publish such a study.

No tests means just that, "no tests" not "no improvement". The tests have to get better to show the improvement or lack thereof.
Nice assumptions there. I for one would love for there to be new and improved ways for DA converters to work, and for nyqvists theorems to be proven false. It would be extremely interesting for many fields outside of digital audio.
Sure, they are assumptions, but without the proof you require, that's all there is left.

It very well could all be marketing, but the notion that people (engineers, project managers, etc.) are going to devote their education (in the case of the engineers at least), career, and thousands of hours to develop high-performance equipment that makes no appreciable difference makes absolutely no logical deductive sense. It could also be a shared delusion and the world's biggest example of confirmation bias, but for the same underlying reasons that seems unlikely, and regardless of what you think of their design objectives, achieving them required a lot of time and effort by people with highly skilled technical backgrounds. Just get a job in the sales and marketing department of one of the many brands owned by Sound United if you want to make bank, or find a real religion if you need to believe in something.
What makes sense is being able to prove that something exists or not. Non proven bullshit is what doesn’t make sense.
To you. Do you only buy wine or coffee that's gone through blind tests and been proven to be the best? Personally, I just buy what tastes good to me. Most people don't base their decisions on what has been proven in blind studies in general. It's even less practical in audio due to the possible variations and the individualism of audio.

Like I already mentioned, I would like to see some really good tests in this area. It should be doable on a smaller scale (it wouldn't be statistically meaningful, though) with a few subjects on gear they are familiar with. Who do we talk to, lol?
 
Joined
Oct 15, 2019
Messages
549 (0.33/day)
No tests means just that, "no tests" not "no improvement".
I disagree. If you, as a high-end audio company, make claims about the sound difference of doing things differently, and are unable to show that in tests, there is zero merit to the different approach.
There could still be a difference, sure, but to which direction is it? No one knows.

It very well could all be marketing, but the notion that people (engineers, project managers, etc.) are going to devote their education (in the case of the engineers at least), career, and thousands of hours to develop high-performance equipment that makes no appreciable difference makes absolutely no logical deductive sense.
Almost as much sense, as doing all the work and then ”forgetting” to test if it made a difference…


To you. Do you only buy wine or coffee that's gone through blind tests and been proven to be the best? Personally, I just buy what tastes good to me. Most people don't base their decisions on what has been proven in blind studies in general. It's even less practical in audio due to the possible variations and the individualism of audio.
I do blind tasting of drinks to find out what I _actually_ like. It’s a fun hobby and shows how bias is king. And it’s anyway a poor comparison, as wine taste differences can actually be consistently detected by humans in blind testing, unlike differences between functioning USB cables.


Like I already mentioned, I would like to see some really good tests in this area. It should be doable on a smaller scale (it wouldn't be statistically meaningful, though) with a few subjects on gear they are familiar with. Who do we talk to, lol?
The burden of proof should be on the ones making claims. Harman does a bunch of testing on audio-related things, but tends to focus on things that could actually matter, like EQ, speaker element types, etc.
As to I2S all DACs internally work with with I2S so its specifically the separate clocks used in an external I2S interface that have dedicated conductors whereas with USB or SPIDIF it gets muxed together.
USB for the most part works asynchronously, treating the computer much like file storage, and no timing information is transmitted over it. Spdif is as you said.
PCM is not the same as thing as I2S
I2s is a physical layer protocol for transferring data. PCM is a data format.
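That distinction can be sketched: PCM is the numbers, I2S is just one way of clocking those same numbers out on wires. This is a simplified toy serializer (it ignores the real I2S quirk of the word-select line changing one bit clock before the MSB):

```python
def i2s_frame(left, right, bits=16):
    """Serialize one stereo PCM frame into (word_select, bit) pairs."""
    out = []
    for ws, sample in ((0, left), (1, right)):   # ws: 0 = left, 1 = right
        u = sample & ((1 << bits) - 1)           # two's-complement wrap
        for i in reversed(range(bits)):          # MSB shifted out first
            out.append((ws, (u >> i) & 1))
    return out

# The PCM data (-1, 1) is unchanged; only its on-wire representation varies.
wire = i2s_frame(-1, 1)
print(len(wire), wire[0], wire[16])  # 32 (0, 1) (1, 0)
```

The same two PCM integers could equally be packed into a USB packet or an S/PDIF subframe; the format of the samples and the transport carrying them are independent layers.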
Whatever the spec may be, high-quality, well-built HDMI cables are plentiful and are better than USB 2.0 cables, which is what USB audio uses
How much better? Enough to get a better outcome, compared to more robust signaling and error handling of USB? Got any tests to link, or is this just an uninformed opinion?
 
Joined
Jan 28, 2021
Messages
845 (0.71/day)
I disagree. If you, as a high end audio company, make claims about the sound difference of doing things differently, and are unable to show that in tests, there is zero merit to the different approach.
There could still be a difference, sure, but to which direction is it? No one knows.
Yeah, we disagree. Audio being subjective, people perceiving it differently, and the difficulty of doing meaningful tests (again, already gone over several times) all make tests impractical for a manufacturer to carry out for each and every product. People prefer different things, and it's not always based on the price of the equipment. A manufacturer, like anyone else, could do a closed test and show that people can pick out DAC A or DAC B, but that would be purely academic.
Almost as much sense, as doing all the work and then ”forgetting” to test if it made a difference…
Forgetting? I mentioned several posts back how Schiit does their internal testing. I'm sure all manufacturers do something similar.
I do blind tasting of drinks to find out what I _actually_ like. It’s a fun hobby and shows how bias is king. And it’s anyway a poor comparison, as wine taste differences can actually be consistently detected by humans in blind testing, unlike differences between functioning USB cables.
Personally, I've never felt the need. I've tried several times to get into more expensive light and medium roasts of coffee and always end up preferring the mid-range medium and dark roasts. I didn't say it was a good comparison, but it's a comparison. Auditory perception and memory recall make any audio comparison uniquely difficult.
The burden of proof should be on the ones making claims. Harman audio does a bunch of testing on audio related things, but tend to focus on actual things that could matter, like eq, speaker element types etc.
All manufacturers in all industries do testing. Harman is known for the Harman curve, but I'm not aware of them doing anything more than that outside of publishing test results.
USB for the most part works asynchronously, treating the computer much like a file storage, and no timing information is transmitted over it. Spdif is as you said.
USB is muxing the parts of the stream the same way S/PDIF is; USB being async doesn't have any bearing on that. I2S is different in that it's using the separate conductors and pinouts of an HDMI cable to split those out.
I2s is a physical layer protocol for transferring data. PCM is a data format.
Right.
How much better? Enough to get a better outcome, compared to more robust signaling and error handling of USB? Got any tests to link, or is this just an uninformed opinion?
Honestly, no clue. I'm really only interested in it from an academic point of view, as to why it could impact the sound. My opinion is uninformed to the extent that I don't work in digital signal processing or analog circuits, nor am I an expert in digital audio formats. Most people that post in these forums have a fairly strong understanding of how binary information is stored and transmitted, but few really know how a DAC works on even a basic level; they're often conflating the two, which is understandable but more often than not leads to the wrong conclusion, in my opinion.
 
Joined
Oct 15, 2019
Messages
549 (0.33/day)
but that would be purely academic.
Well, audio research is purely academic.

Audio being subjective, people perceiving it differently, and the difficulty of doing meaningful tests (again, already gone over several times) all make tests impractical for a manufacturer to carry out for each and every product.
At least do it for the developed technologies that supposedly work to produce better sound. If no proof exists, it’s just bullshit.

Forgetting? I mentioned several posts back how Schiit does their internal testing. I'm sure all manufactures do something similar.
Their testing, as you described it, does nothing to prove that their tech improves sound quality. They don’t control for most biases, and they don’t publish any results.

few really know how a DAC works
A DAC is an exceedingly simple device. It is given some integer value, and it then outputs a voltage that represents that value.

Sound quality is then just a matter of the timing of the values being fed in, and some basic things like noise, cross talk etc.

Timing wise, the only thing that matters is the last clock that feeds samples to the DAC, and that we don’t run out of data in the input buffer.

Any ”deeper” knowledge of DAC designs is completely unnecessary as far as this discussion's topics are concerned (async vs. sync transfer of data to the input buffer). As long as the buffer next to the DAC is not empty at any time, it is impossible to hear a difference between the data acquisition methods. This is because they are not influencing the last clock in any way, nor the data itself. And even then, any difference in sound would be because of bad design, not whether we have sync or async input of data next to the DAC chip.
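The "integer in, voltage out" model described above, ignoring all analog implementation detail, is just a linear mapping. A toy sketch with an assumed 1 V reference:

```python
def ideal_dac(code, bits=16, vref=1.0):
    """Map a signed PCM code to an output voltage (idealized, no noise/jitter)."""
    full_scale = 1 << (bits - 1)   # 32768 for 16-bit signed samples
    return vref * code / full_scale

print(ideal_dac(0), ideal_dac(16_384), ideal_dac(-32_768))
# 0.0 0.5 -1.0 -- everything else (timing, noise, crosstalk) is analog design
```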
 
Joined
Jan 28, 2021
Messages
845 (0.71/day)
Well, audio research is purely academic.
So a company manufacturing audio gear is supposed to do a study/test that is purely academic, on a subject that is 100% open to subjective impression? Why, and what would be the point?
At least do it for the developed technologies that supposedly work to produce better sound. If no proof exists, it’s just bullshit.
I don't understand this viewpoint either. The technology and the product are intrinsically linked, and also open to subjective impression.
Their testing, as you referenced it to be, does nothing to prove that their tech improves sound quality. They don’t control for most biases. And they don’t publish any results.
Nobody is looking for proof from manufacturers. If third-party publications take it on, that's different, and there is an audience for that, but it's still fundamentally problematic for the reasons outlined above, and a lot of people shopping for and buying this gear don't base their decisions on those types of publications.
A DAC is an exceedingly simple device. It is given some integer value, and it then outputs a voltage that represents that value.

Sound quality is then just a matter of the timing of the values being fed in, and some basic things like noise, cross talk etc.

Timing wise, the only thing that matters is the last clock that feeds samples to the DAC, and that we don’t run out of data in the input buffer.

Any ”deeper” knowledge to DAC designs is completely unnecessary as far as this discussions topics are concerned (async vs. sync transfer of data to the input buffer). As long as the buffer next to the DAC is not empty at any time, it is impossible to hear a difference between the data acquisition methods. This is because they are not influencing the last clock in any way, nor the data itself. And even then, any difference in sound would be because of bad design, not whether we have sync or async input of data next to the DAC chip.
You can describe any complex system in basic terms to explain how it works on a conceptual level; that doesn't make it simple. There is a ton of stuff happening, and a variety of approaches to go from integer values to voltage output with varying degrees of accuracy, and that's where the sound quality of a particular DAC lies. We're just going in circles here, so I won't go into how a real-time digital audio stream is inherently different again, but that speaks to the question of timing.
 