Hi guys,

I'm currently looking at the mircepstrum code and have a couple of questions.

The documentation says the minimum delay is 0 s, but the code uses 0.0005 s. I guess that's a (documentation) bug, right?

I'm unfortunately not very familiar with cepstra. Is there some literature you could point me to that explains why one would want to restrict the delay range when *computing* the cepstrum rather than when *interpreting* it? This might also be a nice addition to the docs / code comments.

I'm also wondering why one would want to specify a max delay at all. Isn't that already given by the sample frequency and the size of the window?

Thanks a lot for your insights,
-hendrik
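P.S. To make my reading of the parameters concrete, here's a minimal NumPy sketch of how I *assume* the min/max delay bounds relate to a real cepstrum -- the variable names and the seconds-to-bins mapping are my own guesses, not taken from the actual mircepstrum code:

```python
import numpy as np

fs = 44100           # sample rate (Hz); parameter names here are my own,
N = 2048             # not the ones mircepstrum uses

# synthetic frame: an impulse train with a 220-sample period (~200 Hz),
# so the cepstrum should peak near 220 / 44100 ~= 5 ms
x = np.zeros(N)
x[::220] = 1.0
x *= np.hanning(N)

# real cepstrum: inverse FFT of the log magnitude spectrum
# (the small floor keeps log() finite between the harmonics)
spectrum = np.abs(np.fft.rfft(x))
cepstrum = np.fft.irfft(np.log(spectrum + 1e-2))

# quefrency ("delay") axis: one bin per sample period, so the largest
# representable delay is already fixed by N and fs -- which is exactly
# what makes me wonder why a max delay parameter is needed at all
quefrency = np.arange(len(cepstrum)) / fs

# my reading of the min/max delay parameters: they just bound the
# search range, converted to bin indices like this
min_delay, max_delay = 0.0005, 0.02            # seconds
lo = int(np.ceil(min_delay * fs))              # skips the low-quefrency
hi = int(np.floor(max_delay * fs))             # spectral-envelope part

peak = lo + np.argmax(cepstrum[lo:hi])
print(f"cepstral peak at {quefrency[peak] * 1e3:.2f} ms")
```

If that's roughly what the code does, then the 0.0005 s lower bound would make sense as a way to skip the spectral-envelope bins near quefrency zero -- but please correct me if I'm misreading it.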