
Problems with memory

Hello,

I'm a new XCMS user, and I have problems loading my .cdf files. Around 10 minutes after I run the xcmsSet() function in R, the following message appears:

Error: cannot allocate vector of size 959.1 Mb
In addition: Warning messages:
In netCDFRawData(cdf) :
Reached total allocation of 3758Mb: see help(memory.size)

I tried loading the files separately and then combining them with the c() function, but the error appears even when I load a single file. According to the R documentation, since I'm running 64-bit R on 64-bit Windows the maximum I can address is about 8 TB, but my understanding is that the allocation still depends on my physical RAM (about 4 GB, which would explain the error), so it may be a hardware limitation. Is that the case? I need to handle many fairly large data files (I'm doing untargeted metabolomics), so is there an indirect way to load a big file? Do you have any recommendations for managing such amounts of data?
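
For reference, this is roughly what I tried (the directory name is just a placeholder for my actual path):

library(xcms)

# "my_cdf_dir" is a placeholder for my data folder
cdf_files <- list.files("my_cdf_dir", pattern = "\\.cdf$", full.names = TRUE)

# Windows-only check of how much memory R may allocate (see help(memory.size))
memory.limit()

# One xcmsSet per file, then merge the results with c()
sets <- lapply(cdf_files, xcmsSet)
xset <- do.call(c, sets)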

Thanks

Re: Problems with memory

Reply #1
To process large files, I think the simplest way to get this working is to upgrade your memory to 8 GB or more.

You can also try pre-processing your files (convert to centroid mode, apply a noise filter) to reduce their size.
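
For example, if your data are (or are converted to) centroid mode, you can also discard low-intensity signal during peak detection with centWave; noise and prefilter are real centWave arguments, but the values below are only examples and depend on your instrument:

library(xcms)

xset <- xcmsSet(cdf_files,                 # your vector of .cdf file paths
                method    = "centWave",    # assumes centroided data
                ppm       = 25,
                peakwidth = c(5, 20),
                prefilter = c(3, 100),     # keep regions with >= 3 scans above 100 counts
                noise     = 500)           # ignore centroid signals below this intensity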

Ralf.

 

Re: Problems with memory

Reply #2
I will try to get it to work on a better computer then, since I want to use the raw data. Thank you!