XCMS / How are XCMS's retention times calculated
I'm noticing some strange things about the reported RTs (xcmsSet@peaks[,"rt"]) from XCMS. I'm using findPeaks.centWave.
My initial confusion came from the following data: a plot of the retention-time difference between an M+0 peak and its isotope. You can see that the retention times fall into discrete chunks rather than being continuously distributed.
[attachment: discrete.png]
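To illustrate the banding: if each reported RT is taken from a discrete scan time rather than interpolated, then the RT difference between any two peaks is forced to be an integer multiple of the scan interval. A minimal sketch in R (the scan interval, peak times, and snap_to_scan helper are all hypothetical, not xcms internals):

```r
scan_interval <- 0.4                       # assumed seconds between scans
scan_times <- seq(0, 600, by = scan_interval)

# Snap a continuous apex time to the nearest discrete scan time
snap_to_scan <- function(rt) scan_times[which.min(abs(scan_times - rt))]

# Hypothetical continuous apex times of an M+0 peak and its isotope
rt_m0  <- 123.37
rt_iso <- 123.52

diff_reported <- snap_to_scan(rt_iso) - snap_to_scan(rt_m0)
diff_reported / scan_interval              # always an integer multiple
```

Under this assumption, a histogram of M+0/isotope RT differences would show exactly the discrete bins at multiples of the scan rate seen in the plot above.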
The bins appear to correspond to my scan rate, which made me question how the RTs are calculated, so I tried to figure out how this is done. Unfortunately, none of my methods reproduced the reported RT:
[attachment: a peak.png]
Black: RT reported by xcms
Blue: peak centroid (intensity-weighted mean of all points within rtmin:rtmax and mzmin:mzmax, using rawEIC)
Red: RT of the most intense scan (maxo)
Green: midpoint of the peak window (rtmin:rtmax)
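The three candidate definitions above can be sketched on a synthetic EIC; this is just an illustration of what each colored marker computes, not xcms code (the Gaussian peak and all variable names are made up):

```r
rt  <- seq(100, 110, by = 0.5)                  # scan times (s) in rtmin:rtmax
int <- dnorm(rt, mean = 104.8, sd = 1.2) * 1e6  # synthetic Gaussian EIC

# Blue: intensity-weighted centroid of all points in the window
rt_centroid <- sum(rt * int) / sum(int)

# Red: time of the most intense scan (the scan holding maxo)
rt_maxo <- rt[which.max(int)]

# Green: midpoint of the integration window
rt_mid <- (min(rt) + max(rt)) / 2

c(centroid = rt_centroid, maxo = rt_maxo, midpoint = rt_mid)
```

For a symmetric, finely sampled peak all three land close together; they diverge for tailing peaks or coarse scan rates, which is when the choice of definition matters.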
Additionally, I was surprised at how far off some of the peaks were from my calculated centroid. For example, this was a well-shaped peak where I can't figure out what the reported RT corresponds to.
[attachment: rt_calc.png]
In summary:
1. How is the reported RT chosen? Should it not be a continuous variable?
2. What do rtmin/rtmax correspond to? (These are the limits of the EICs above.)
Thanks!
P.S. I did not find the answer to these questions in the centWave paper.