
File writing/reading error detection (repeated sections)

Posted: Fri Dec 22, 2017 2:39 am
by Damien Cassidy
Hi,

I have an issue where a company has supplied BWF files (from analog tape digitisation) in which a suspected faulty CF card has caused the audio stream to jump, stutter and repeat at times. It's a needle-in-a-haystack problem to find by ear, and it potentially affects 500+ files of up to 2 hours each, so listening isn't really an option. I figure there must be some way of detecting where a section of the PCM stream is identical to another section of the same file. Is there anything in WL that can test for this?
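
Failing that, I've been toying with the idea of scripting a check myself. Since the fault is a card duplicating data, the repeats should be bit-identical, so hashing fixed-size blocks of PCM and flagging any block that turns up more than once in the same file might work. A rough sketch of that idea only (it assumes the third-party Python "soundfile" package, non-overlapping one-second blocks, and a made-up filename):

Code:
import hashlib
from collections import defaultdict

import soundfile as sf

BLOCK_SECONDS = 1.0  # comparison granularity; smaller blocks catch shorter repeats

def find_repeated_blocks(path):
    # Read as fixed-point so bit-identical repeats hash identically.
    data, rate = sf.read(path, dtype="int32", always_2d=True)
    block = int(rate * BLOCK_SECONDS)
    seen = defaultdict(list)  # block hash -> list of block start times in seconds
    for start in range(0, len(data) - block + 1, block):
        digest = hashlib.sha1(data[start:start + block].tobytes()).hexdigest()
        seen[digest].append(start / rate)
    # keep only blocks that appeared at more than one position
    return [times for times in seen.values() if len(times) > 1]

# "tape_reel_01.wav" is a placeholder filename.
for times in find_repeated_blocks("tape_reel_01.wav"):
    print("identical audio at (seconds):", ", ".join(f"{t:.1f}" for t in times))

The obvious catch is that non-overlapping blocks will miss repeats that don't line up with the one-second grid; a smaller hop between blocks would catch more at the cost of speed.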

If anyone knows any alternatives (I've tried Dobbin and Baton) I'd really appreciate the help.

Damien

Re: File writing/reading error detection (repeated sections)

Posted: Fri Dec 22, 2017 7:48 am
by bob99
I don't know how it works or how configurable it is, but have you considered "Find Similar" in iZotope RX?

I don't know of anything in WaveLab specifically for this, unless there are tiny digital dropouts or sharp waveform discontinuities at the in and out points where the card skipped; WaveLab's Global Analysis Errors/Glitches tab will find those. I usually set a high threshold and sensitivity (like 99 and 99), but you could try lower values as well, although if you do you might get a lot of false positives.

Edit: then again, "Find Similar" probably requires an initial selection to search from, so it possibly wouldn't be useful at all. But I haven't used it, so I don't know what it can do beyond that.

Re: File writing/reading error detection (repeated sections)

Posted: Fri Dec 22, 2017 9:26 am
by PG
bob99 wrote:
Fri Dec 22, 2017 7:48 am
Unless there are tiny digital dropouts or sharp waveform discontinuities at the in and out points where the card skipped.
There is a high chance of that (and it is the only one, I would say).
bob99 wrote:
Fri Dec 22, 2017 7:48 am
WaveLab's Global Analysis Errors/Glitches tab will find those. I usually set a high threshold and sensitivity (like 99 and 99), but you could try lower values as well, although if you do you might get a lot of false positives.
The Error Detection tool (a dedicated tool window) is certainly a better alternative.
Another visual aid is the Spectrogram: you can generally see when there is an unexpected click.

Re: File writing/reading error detection (repeated sections)

Posted: Fri Dec 22, 2017 11:49 am
by bob99
PG wrote:
Fri Dec 22, 2017 9:26 am
The Error Detection tool (a dedicated tool window) is certainly a better alternative.
Zoomed-in screenshots of a couple of the transitions might help. If there are digital mutes of 5-10 ms, I've found the only thing that can find those is the Global Analysis Errors/Glitches tab, and it's extremely effective. I was never able to get the Error Detection tool to detect that particular type of problem, no matter what the settings. I'm not saying that's the case here, but I think it depends on what the transitions are like. If there are sharp discontinuities rather than mutes, I think it's worth testing both Global Analysis and the Error Detection tool on the transition points of some known bad areas to see which finds them with the fewest false positives.
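
And if it does turn out to be short digital mutes and WaveLab misses some of them, a run of dead-zero samples is also something you could flag with a small script outside WaveLab. Just a sketch of that idea, assuming the Python "soundfile" package, an assumed 5 ms minimum length, and a made-up filename:

Code:
import numpy as np
import soundfile as sf

MIN_MUTE_MS = 5  # assumed minimum mute length to report

def find_digital_mutes(path):
    data, rate = sf.read(path, dtype="int32", always_2d=True)
    silent = np.all(data == 0, axis=1)           # True where every channel is exactly zero
    min_run = int(rate * MIN_MUTE_MS / 1000)
    edges = np.diff(silent.astype(np.int8))      # +1 where silence starts, -1 where it ends
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if silent[0]:
        starts = np.insert(starts, 0, 0)
    if silent[-1]:
        ends = np.append(ends, len(silent))
    return [(s / rate, (e - s) * 1000 / rate)    # (start in seconds, length in ms)
            for s, e in zip(starts, ends) if e - s >= min_run]

# Placeholder filename.
for start, length in find_digital_mutes("tape_reel_01.wav"):
    print(f"digital mute at {start:.3f} s, {length:.1f} ms")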

Re: File writing/reading error detection (repeated sections)

Posted: Fri Dec 22, 2017 12:03 pm
by PG
bob99 wrote:
Fri Dec 22, 2017 11:49 am
I've found the only thing that can find those is the Global Analysis Errors/Glitches tab, and it's extremely effective. I was never able to get the Error Detection tool to detect that particular type of problem, no matter what the settings. I'm not saying that's the case here, but I think it depends on what the transitions are like.
Interesting to know. Global Analysis and the Error Detection panel use totally different methods.

Re: File writing/reading error detection (repeated sections)

Posted: Fri Dec 29, 2017 1:39 am
by Damien Cassidy
This issue isn't generating clicks or dropouts, but it does create misaligned waveforms.
[Attachment: Stutter_Issue.JPG]
I had some success with Global Analysis/Glitches by setting threshold and sensitivity to 99, though it's still not finding everything. I'll play with it a bit more, but it's inspired me to delve deeper into some of my other QC tools and see if I can't get them working together in an automated workflow.

Re: File writing/reading error detection (repeated sections)

Posted: Fri Dec 29, 2017 3:22 am
by bob99
Damien Cassidy wrote:
Fri Dec 29, 2017 1:39 am
This issue isn't generating clicks or dropouts, but it does create misaligned waveforms.
Stutter_Issue.JPG
I had some success with Global Analysis/Glitches by setting threshold and sensitivity to 99, though it's still not finding everything. I'll play with it a bit more, but it's inspired me to delve deeper into some of my other QC tools and see if I can't get them working together in an automated workflow.
As you've probably found, adjusting those settings does make a difference to the number of "glitches" found, but it's a trade-off between finding too many (false positives) and too few. From the manual:

"Threshold sets the value at which a change in level is considered to be a glitch. The higher the value, the less sensitive the detection."

"Sensitivity is a length value that represents the length of time in which the waveform must exceed the threshold to be reported as a glitch. The higher the value, the less sensitive the detection."

After detection, you can create markers at the hot points at the bottom of the window.

Strangely, I've just found I've gotten more hits using threshold 30 and sensitivity 99 than using threshold 30 and sensitivity 30, on at least one track, as if the sensitivity is more sensitive at 99 than at 30. Not sure what to make of that, but it's useful to experiment with the numbers.
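
For what it's worth, my rough mental model of those two settings, going only by the manual text above, is something like the sketch below. It's a guess at the concept, not WaveLab's actual algorithm (and it doesn't explain the oddity I just described), so treat it purely as an illustration of "level change over a threshold, lasting at least the sensitivity length":

Code:
import numpy as np

def flag_glitches(samples, rate, threshold, sensitivity_ms):
    # samples: assumed 1-D (mono) array; threshold units are arbitrary here.
    delta = np.abs(np.diff(samples))            # sample-to-sample level change
    over = delta > threshold
    min_len = max(1, int(rate * sensitivity_ms / 1000))
    hits, run_start = [], None
    for i, flag in enumerate(over):
        if flag and run_start is None:
            run_start = i                        # a run over the threshold begins
        elif not flag and run_start is not None:
            if i - run_start >= min_len:
                hits.append(run_start / rate)    # glitch start time in seconds
            run_start = None
    if run_start is not None and len(over) - run_start >= min_len:
        hits.append(run_start / rate)            # run continued to the end of the file
    return hits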

Re: File writing/reading error detection (repeated sections)

Posted: Fri Dec 29, 2017 5:02 am
by Damien Cassidy
Out of interest, does WL 9.x facilitate batch analysis, i.e. systematically running Global Analysis in a batch process and writing markers at the hot points?

Re: File writing/reading error detection (repeated sections)

Posted: Fri Dec 29, 2017 6:28 am
by bob99
Damien Cassidy wrote:
Fri Dec 29, 2017 5:02 am
Out of interest, does WL 9.x facilitate batch analysis, i.e. systematically running Global Analysis in a batch process and writing markers at the hot points?
I don't see Global Analysis or Error Detection in the 9.5 Batch Processor, but that's a great idea.

Looking at your screenshot, I think PG was probably right that Error Detection and Correction would find those spots as well. Worth testing anyway.
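
In the meantime, if you do go the scripted route for your automated workflow, pointing checks like the earlier sketches at a whole folder and writing a report is simple enough. Again only a sketch: the folder, the report name, and the qc_checks module (the earlier hypothetical functions saved to a qc_checks.py file) are all made up:

Code:
import csv
from pathlib import Path

# Assumption: the two earlier sketches were saved locally as qc_checks.py.
from qc_checks import find_digital_mutes, find_repeated_blocks

SOURCE_DIR = Path("/archive/tape_transfers")   # placeholder folder of BWF/WAV files

with open("qc_report.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["file", "issue", "time_seconds"])
    for wav in sorted(SOURCE_DIR.glob("*.wav")):
        for times in find_repeated_blocks(str(wav)):
            writer.writerow([wav.name, "repeated block", f"{times[0]:.1f}"])
        for start, length in find_digital_mutes(str(wav)):
            writer.writerow([wav.name, f"digital mute ({length:.1f} ms)", f"{start:.3f}"])

A CSV like that at least gives you a list of timestamps to spot-check in WaveLab.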