Problems with memory June 21, 2011, 04:45:10 PM

Hello, I'm a new XCMS user, and I have problems loading my .cdf files. About 10 minutes after I run the xcmsSet() function in R, the following message appears:

Error: cannot allocate vector of size 959.1 Mb
In addition: Warning messages:
In netCDFRawData(cdf) :
Reached total allocation of 3758Mb: see help(memory.size)

I tried loading the files separately and then combining them with the c() function, but the error occurs even when I load a single file. According to the R documentation, a 64-bit R on 64-bit Windows can address up to 8 TB, but my understanding is that this limit still depends on the physical RAM installed (about 4 GB in my case, which matches the 3758 Mb in the warning), so it may be a hardware limitation. Is that so? I need to handle many fairly large data files (I'm doing untargeted metabolomics), so is there an indirect way to load a big file? Do you have any recommendations for managing such amounts of data? Thanks
Re: Problems with memory Reply #1 – June 21, 2011, 07:16:09 PM

To process large files, I think the simplest way to get things working is to upgrade your memory to 8 GB or more. You can also try to preprocess your files (convert them to centroid mode, apply a noise filter) so they are smaller before loading. Ralf.
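As a complement to the advice above, a minimal sketch of the one-file-at-a-time approach in R (the file names are placeholders, and profstep is the classic xcmsSet() argument controlling the bin width of the profile matrix; a coarser step reduces the memory each file needs — treat the exact values as assumptions to tune for your data):

```r
library(xcms)

## Placeholder file names -- substitute your own .cdf paths.
files <- c("sample1.cdf", "sample2.cdf", "sample3.cdf")

## Process one file at a time so only a single raw file is ever
## held in memory, then combine the per-file results with c().
sets <- vector("list", length(files))
for (i in seq_along(files)) {
  ## A coarser profile step (e.g. 2 m/z instead of the default 1)
  ## shrinks the profile matrix built for each file.
  sets[[i]] <- xcmsSet(files[i], profstep = 2)
  gc()  # release freed memory between files
}
combined <- do.call(c, sets)
```

Note this only reduces the per-file footprint; if a single file's profile matrix alone exceeds available RAM (as the 959.1 Mb allocation above suggests), more physical memory or smaller preprocessed files is still the real fix.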
Re: Problems with memory Reply #2 – June 26, 2011, 10:07:50 PM

I will try to get it to work on a better computer then, as I want to use the raw data... thank you!