charles.michelich
BIAC Alum
   
USA
183 Posts
Posted - Jan 13 2004 : 1:36:35 PM
SPM Users,
SPM's statistical estimation routines process the data in chunks. In general, as the chunk size increases, the estimation time decreases, up until you run out of memory. When you run out of memory, SPM will either crash with an "Out of memory" error or become very slow because it is attempting to use the hard drive as memory. If you are running out of memory, you may want to try adjusting the chunk size.
In SPM 2 the chunk size is controlled by the variable "defaults.stats.maxmem" in the file spm_defaults.m. To use a different chunk size, copy the original spm_defaults.m file into your local scripts directory (or somewhere else on your MATLAB path) and edit this file, changing "defaults.stats.maxmem" to a different number.
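For example, the edited line in your local copy of spm_defaults.m might look like the following (the 2^26 value is just an illustration; pick whatever limit suits your machine):

```matlab
% Local copy of spm_defaults.m (SPM2), placed on the MATLAB path
% ahead of the SPM installation directory.
% Limit the estimation chunk size to 64 MB (value in bytes).
defaults.stats.maxmem = 2^26;
```

You can confirm which copy MATLAB is using with `which spm_defaults`.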
In SPM 99 the chunk size is controlled by the variable "maxMem" in the file spm_spm.m. To use a different chunk size, copy the original spm_spm.m file into your local scripts directory (or somewhere else on your MATLAB path) and edit this file, changing maxMem to a different number. Unfortunately, spm_spm.m also contains a lot of other code, so if any fixes are made to this file, they will not be reflected in your local copy. Therefore, you may want to occasionally check whether spm_spm.m has been updated.
Since several people in the lab have run out of memory using SPM 99, I have changed the default from 512 MB (2^29) to 64 MB (2^26).
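For reference, that change amounts to editing one line inside your local copy of spm_spm.m (the exact position of the line within the file may differ between SPM 99 releases):

```matlab
% Local copy of spm_spm.m (SPM99): chunk size in bytes.
% Was 2^29 (512 MB); reduced to 2^26 (64 MB) to avoid
% out-of-memory errors on lab machines.
maxMem = 2^26;
```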
Enjoy, Chuck