OK, I've been doing some measurements using the IQ_test I mentioned before & I may have a handle on what's going on, but it's tentative & needs to be analysed/discussed/teased out.
The IQ_test gives a measure of the timing differences that may occur when a specially generated audio file is played back & recorded. The file is a stereo one with mathematically known timing differences between the channels. This is an important point that I'll come back to later: any timing differences found in the plots are there because the signal in one channel has slipped in time relative to the other.
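For anyone curious what I mean by "mathematically known timing differences", here's a rough sketch (in Python) of the kind of file involved. This is not the actual IQ_test generator - the sweep length, sample rate etc. are just illustrative assumptions. The point is that the timing relationship between the two channels is known exactly (here it's exactly zero), so anything measured later must have been introduced by the playback & recording chain:

```python
# Minimal sketch (NOT the actual IQ_test generator): a stereo sine sweep
# where the two channels are mathematically identical, i.e. the known
# inter-channel timing difference is exactly zero at every frequency.
# Any delay measured after playback & recording is then a timing slip
# introduced by the chain, not by the file itself.
import numpy as np
from scipy.io import wavfile
from scipy.signal import chirp

fs = 44100                       # sample rate (Hz) - assumption
t = np.arange(0, 30.0, 1.0/fs)   # 30-second sweep - assumption
sweep = chirp(t, f0=20, f1=22000, t1=t[-1], method='logarithmic')

stereo = np.column_stack([sweep, sweep])           # L & R identical
stereo = (stereo * 0.8 * 32767).astype(np.int16)   # 16-bit PCM with some headroom
wavfile.write('iq_test_sweep.wav', fs, stereo)
```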
Firstly we should look at what a plot of a perfect reproduction of the file would look like:
So this graph needs some explanation. The X-axis is the frequency range from 0Hz to 22KHz. The Y-axis is the timing slip in microseconds. In other words, any timing difference at any frequency will show up on it. This graph shows a predominantly flat line at zero, i.e. no timing slips at any frequency. The spikes at 6KHz, 12KHz & 19KHz are anomalies of the software which I haven't been able to track down, but they can be ignored in the following plots. The real point is to show what a perfect playback would look like - a flat line across all frequencies. I generated this plot by feeding the generated file directly into the analysis software without going through any playback & recording steps.
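Just to make the analysis idea concrete - this isn't the IQ_test's own algorithm (I don't know its internals), but one conceptually simple way to get a timing-slip-vs-frequency plot out of a stereo recording is to turn the inter-channel phase difference at each frequency into a time difference. A minimal sketch, with the filename just a placeholder:

```python
# Rough sketch of one way to get a "timing slip vs frequency" plot from a
# stereo recording. This is NOT the IQ_test's own method, just an illustration
# of the idea: phase difference between the channels -> time difference.
import numpy as np
from scipy.io import wavfile
import matplotlib.pyplot as plt

fs, data = wavfile.read('recorded_playback.wav')    # placeholder filename
left = data[:, 0].astype(float)
right = data[:, 1].astype(float)

spec_l = np.fft.rfft(left)
spec_r = np.fft.rfft(right)
freqs = np.fft.rfftfreq(len(left), 1.0/fs)

# Inter-channel phase difference -> time difference at each frequency.
# (Phase wrapping is ignored here, so large slips at high frequencies
# would need unwrapping in a proper analysis.)
phase_diff = np.angle(spec_l * np.conj(spec_r))
keep = freqs > 0                         # avoid dividing by zero at DC
freqs, phase_diff = freqs[keep], phase_diff[keep]
delay_us = phase_diff / (2 * np.pi * freqs) * 1e6

plt.plot(freqs, delay_us)
plt.xlabel('Frequency (Hz)')
plt.ylabel('Inter-channel timing slip (us)')
plt.show()
```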
I decided to do these first measurements through the analogue out of the laptop, as I hoped that it might exaggerate any differences between playback software & maybe give some hints of what differences to look for in the plots. The timing errors in these plots will be a result of a number of things, not just the playback software:
- the computer being used for playback
- the DAC converting it to analogue
- the recording device being used for recording
- the playback software itself.
So by varying only the playback software, it is hoped that we can see some changes in the plots, provided they are not swamped by the timing issues of the rest of the playback & recording chain.
So first, a look at playback through Foobar, straight out of the laptop's analogue out & recorded on a Zoom H2 recorder.
You can see that the timing slippage gets worse (the rising slope) as the frequency rises. Also that there are mis-timing fluctuations (i.e. it's not a straight clean line but a fuzzy up & down line) at all frequencies, & that these fluctuations increase (more fuzziness) as the frequency increases.
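If we wanted to put rough numbers on the "rising slope" & the "fuzziness" rather than eyeballing the plots, something like this would do it (just a sketch - the freqs & delay_us arrays are assumed to come from an analysis like the one above, or from the exported results):

```python
# Rough numbers for the "rising slope" & the "fuzziness": fit a straight line
# to the delay-vs-frequency data & look at the scatter around it.
# 'freqs' (Hz) & 'delay_us' are assumed to come from the earlier sketch.
import numpy as np

slope, intercept = np.polyfit(freqs, delay_us, 1)
residuals = delay_us - (slope * freqs + intercept)

print(f"slope: {slope * 1000:.3f} us per kHz")                      # how fast the slip grows
print(f"fuzziness (residual std dev): {np.std(residuals):.3f} us")  # how fuzzy the line is
```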
Now for MQN through the laptop.
As you can see it looks very similar to the Foobar plot with no glaring differences.
OK, but this is a very high-level plot across the full frequency spectrum, which shows timing delays rising to 90 milliseconds at the higher frequencies.
So I said the channel timing differences are important & here's why - in the real world we localise sound by the timing differences we can perceive between the same signal reaching each ear. We are very sensitive to this timing difference, as it is a necessary survival mechanism. Psychoacoustic tests have shown that, in the right circumstances, a timing difference of 1.5 microseconds can be sensed. This seems very low & probably 5 microseconds is a more reasonable figure. What this means is that at 10 feet away we can sense a sound source moved 3 inches left or right.
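For those who want to see where that kind of figure comes from, here's a back-of-envelope sketch using a crude two-point ear model (no head diffraction; the ear spacing & speed of sound are assumptions), so it only shows the order of magnitude - roughly an inch to a few inches at these thresholds - rather than nailing the exact number:

```python
# Back-of-envelope geometry: how far does a source 10 feet away have to move
# sideways to produce a given inter-aural time difference (ITD)?
# Crude two-point ear model, small-angle approximation, no head diffraction.
SPEED_OF_SOUND = 343.0   # m/s - assumption
EAR_SPACING    = 0.18    # m, rough distance between the ears - assumption
DISTANCE       = 3.05    # m, i.e. about 10 feet

def lateral_shift_for_itd(itd_seconds):
    # Small-angle approximation: ITD ~= ear_spacing * shift / (c * distance)
    return itd_seconds * SPEED_OF_SOUND * DISTANCE / EAR_SPACING

for itd_us in (1.5, 5.0, 10.0):
    shift_m = lateral_shift_for_itd(itd_us * 1e-6)
    print(f"{itd_us:>4} us ITD  ->  ~{shift_m * 39.37:.1f} inches at 10 feet")
```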
Anyway, what this means is that we perceive the sound stage of our audio playback mostly through these timing differences arriving at our ears (inter-aural intensity differences - IID - are also important). These timing differences have been captured by the recording engineer with whatever mic pattern he used & also later by panning sounds to the left or right.
To perceive a solid sound stage the recording has to be done properly & the inter-channel timings that whatever mic configuration picks up have to be preserved.
I have seen it stated by researchers that the main frequency range used in this way is 300Hz to 5KHz, so here's a plot of 300Hz to 1KHz for starters. (Edit: Actually I think it may be 500 or 700Hz upwards? So another graph is needed to zoom in on this area.)
Firstly Foobar
And MQN
With this zoomed-in view I believe we can see that MQN shows a slope that is lower than Foobar's - in other words it has lower timing errors at each point in this frequency range.
This may be one of the clues to what we are hearing - it results from lower timing errors when played back through MQN.
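To put a number on that "lower slope" observation rather than relying on my eyeballing of the plots, something along these lines could be run on the exported data (again just a sketch - the foobar_*/mqn_* arrays are assumed to come from the same sort of analysis as above):

```python
# Putting a number on the "lower slope": fit a line to each player's
# delay-vs-frequency data over the same band & compare the slopes.
# The foobar_* / mqn_* arrays are assumed to come from the earlier sketches
# or from exported IQ_test results.
import numpy as np

def slope_in_band(freqs, delay_us, lo=300.0, hi=1000.0):
    band = (freqs >= lo) & (freqs <= hi)
    slope, _ = np.polyfit(freqs[band], delay_us[band], 1)
    return slope * 1000.0                  # microseconds per kHz

print("Foobar slope:", slope_in_band(foobar_freqs, foobar_delay_us), "us/kHz")
print("MQN slope:   ", slope_in_band(mqn_freqs, mqn_delay_us), "us/kHz")
```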
I have further plots, done playing back through an external DAC (Ciunas) rather than the laptop's analogue outs, & these again show a generally lower slope than the laptop plots. So my tentative conclusion is that we don't have perfect playback timing but we do have improved timing. This timing error varies with frequency - rising towards higher frequencies - & this variation interferes with the solidity of the sound stage. If the plot showed a fixed timing across all frequencies, i.e. a flat horizontal plot, then we would have no issue, I believe - it would produce the best sound stage.
BTW, if anyone is interested, I can supply the analysis files, which can be plotted, zoomed in or out & changed in various ways to get different views into the results.
Of course all this could be rationalisation of the results? That's why getting a body of results together would give a better handle on the trend that we are looking for.
Oops, I see my dropbox links don't show as images? Too late to do anything with them now - maybe one of the mods can help to host these? Here's the dropbox folder for the files
https://www.dropbox.com/sh/6v25r8woxupbesb/DGmKOJQnjK