Recent Posts
1
XCMS Online / Re: File check and TICs failed
Last post by dakshatt -
Just checking if there has been a solution to this. I have 500 GB+ of data uploaded to XCMS Online, but I am unable to do anything with it. Every time I submit the job, I get the same error just after 5% progress. The FAQs do not seem to point in the right direction; I tried their suggested resolution, but nothing has changed.
3
XCMS - FAQ / Re: Error in do_adjustRtime_peakGroups(peaks = peakmat, peakIndex = object@groupidx,
Last post by johannes.rainer -
Dear Yuechen,

The *peak density* alignment method requires a certain number of features (a.k.a. grouped peaks) present across all samples to perform the alignment. Without knowing more about your data it is hard to tell what the problem is. I'd suggest you redo the correspondence analysis (peak grouping) with less stringent settings and retry.

Secondly, I would suggest that you switch over to the *new* user interface and functions (see https://bioconductor.org/packages/release/bioc/vignettes/xcms/inst/doc/xcms.html for details). There you also have the possibility to perform the alignment on a subset of samples (e.g. if you have QC samples) or to exclude blank samples from the alignment (blanks could in fact cause the problem described above).
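To give you an idea, a minimal sketch with the new interface could look like the following, assuming an XCMSnExp object xdata with detected chromatographic peaks and a phenodata column sample_group containing e.g. "QC" and "blank" labels (all object names and parameter values below are placeholders you would need to adapt to your data):

library(xcms)

## Correspondence (peak grouping) with less stringent settings: lowering
## minFraction keeps features that are present in fewer samples per group.
pdp <- PeakDensityParam(sampleGroups = xdata$sample_group,
                        minFraction = 0.4, bw = 30)
xdata <- groupChromPeaks(xdata, param = pdp)

## Peak-groups alignment on a subset of samples (e.g. the QCs), which also
## keeps the blanks out of the retention time adjustment. The subset and
## subsetAdjust arguments require a recent xcms version.
pgp <- PeakGroupsParam(minFraction = 0.85,
                       subset = which(xdata$sample_group == "QC"),
                       subsetAdjust = "average")
xdata <- adjustRtime(xdata, param = pgp)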

jo
4
XCMS - FAQ / Error in do_adjustRtime_peakGroups(peaks = peakmat, peakIndex = object@groupidx,
Last post by genefisher -
Hey,
Does anybody know how to solve this problem with XCMS? It shows:
> xset2 <- retcor(xset1, family= "s", plottype= "m", missing=1, extra=1,
+ span=1)
Performing retention time correction using 2 peak groups.
Error in do_adjustRtime_peakGroups(peaks = peakmat, peakIndex = object@groupidx,  :
  Not enough peak groups even for linear smoothing available!
In addition: Warning message:
In do_adjustRtime_peakGroups(peaks = peakmat, peakIndex = object@groupidx,  :
  Too few peak groups for 'loess', reverting to linear method


Many thanks!

Yuechen
5
XCMS Online / change columns in result table
Last post by Anni123 -
Hello,
I am not able to change the columns in the result table. I would like to view "MS/MS" or "Metlin_MS/MS", but the result table does not change when I select these columns. I have also tried other internet browsers (Firefox, Chrome, ...).
I can see that there should be MS/MS results, because there is a "Location of MS/MS scans" entry, but I am not able to view it in the result table.
Does anyone know how to change that?
Thank you in advance for your answer.
Kind regards,
Anni
8
Mass spectrometers / Re: Suggestion to purchase of a high-resolution MS
Last post by CoreyG -
Hi Sam,

I don't have a lot of HRMS experience, but I've gradually been doing more. Our focus is high-throughput, so robustness is something we care about as well. As a general rule, top-spec instruments tend to be less robust than lower-tier models, although there are always exceptions.

So far I've used an Agilent qTOF (6540) and two Thermo Orbitraps (Fusion Lumos and HF-X). All of these were set up for analytical-flow LC-MS.
The Agilent system had quite a few boards replaced over a few years, but I can't say that the instrument was looked after particularly well.
The Fusion Lumos ran well for over 1000 continuous samples (plasma), but beyond that I am not sure how robust the machine is.
The HF-X had a bad reputation for 'dirtying', but Thermo says they have fixed this in the current generation. If you are willing to wait a few months I can fill you in on how it goes :)

For the work I was doing (lipidomics), the much higher resolution of the Orbitraps was very useful. Overall sensitivity was higher with the Orbitraps as well, though we are comparing two very new instruments to a much older one.
However, both Orbitraps have very rough ion funnels, which causes in-source fragmentation of fragile molecules. This isn't exclusive to Thermo instruments.
Most vendors are aware of this, so you should bring it up with them if this is a concern.

Regarding quantitation, there are different levels of quantitation that people expect. Most metabolomics/lipidomics people have a relaxed view of quantitation; that is, there are caveats and assumptions that everyone accepts can't be resolved (not enough internal standards, shotgun vs. LC, ...).
Most newer instruments provide a fairly decent dynamic range, certainly much more than the expected variability you would see in a single metabolite across a group of people. So I guess it's important to get the sample prep right so that the right amount of 'stuff' goes into the machine.

I'm sure others can chime in with some more experiences and knowledge.

Cheers,
Corey
9
Mass spectrometers / Suggestion to purchase of a high-resolution MS
Last post by Sam Zhang -
We plan to buy a high-resolution LC-MS system with enough dynamic range for quantification. Currently, our applications are focused on small-molecule drug metabolism. We particularly care about robustness; performance, software, databases, etc. also matter. I would greatly appreciate it if anyone could message me with direct user experience from you or your colleagues/friends. Experience with either one or multiple instrument models will be helpful. Thanks.
10
Other / Re: Peak alignment with large dataset (over 2500 samples and growing)
Last post by metabolon1 -
Hello folks,

Some updates and a few more questions.

I have been using the 'IPO' package to determine my XCMS parameters. The resulting peak table contains over 50,000 features. I'm ok with this for now. Has anyone else used IPO? Thoughts?
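In case it's useful to anyone, the gist of what I ran is something like the following (the file paths and parameter tweaks shown here are placeholders, not my exact settings):

library(IPO)

## Optimize centWave peak picking on a few representative files only;
## running IPO on the full data set would take far too long.
pp_params <- getDefaultXcmsSetStartingParams("centWave")
pp_params$ppm <- 15   # placeholder mass accuracy
pp_result <- optimizeXcmsSet(files = list.files("mzML", full.names = TRUE)[1:3],
                             params = pp_params, subdir = "IPO_peakpicking")

## Then optimize retention time correction and grouping parameters,
## starting from the best peak-picking result.
rg_params <- getDefaultRetGroupStartingParams()
rg_result <- optimizeRetGroup(xset = pp_result$best_settings$xset,
                              params = rg_params, subdir = "IPO_retgroup")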

I successfully ran CAMERA (thanks CoreyG for the suggestions) with the goal of finding redundant features (e.g. adducts/isotopes). It identified ~7,000 pcgroups among the ~50,000 features. Now, if I understand correctly, I need to select one feature from each pcgroup to serve as the "representative" for that compound. Any recommendations on how to make this selection?
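For concreteness, one heuristic I'm considering is to simply keep the most intense feature in each pcgroup, roughly like this (xsa being the annotated xsAnnotate object; the way I pull out the per-sample intensity columns is a guess and may need adjusting for your peak table):

library(CAMERA)

## Rough sketch: pick the single most intense feature per pcgroup.
## getPeaklist() returns one intensity column per sample, named after the
## sample names of the underlying xcmsSet.
pl <- getPeaklist(xsa)
int_cols <- make.names(sampnames(xsa@xcmsSet))
pl$mean_int <- rowMeans(pl[, int_cols], na.rm = TRUE)
representatives <- do.call(rbind, lapply(split(pl, pl$pcgroup),
                                         function(g) g[which.max(g$mean_int), ]))

Whether "most intense" is even the right criterion (as opposed to, say, preferring an annotated [M+H]+ where available) is exactly the kind of input I'm after.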

Many thanks!