[MLton-user] gcc memory exhaustion

Matthew Fluet fluet at tti-c.org
Sat Apr 18 08:33:04 PDT 2009


On Sat, 18 Apr 2009, Dan DuVarney wrote:
> I'm attempting to use SVN r7016 MLton/MinGW to compile a large program
> with time-profiling enabled. MLton generates a 235MB .c source file
> which gcc (3.4.5) chokes on (i.e., runs out of memory). I've saved the
> source files and tried compiling the MLton-generated code without any
> optimizations enabled, but haven't been able to successfully compile.
> I'm using a Windows/XP machine with 3GB memory.

Is this using the C codegen or the native codegen?  If it is using the 
native codegen (the default on an x86 platform), then there will be only 
one .c file that includes global and static data for the program 
(including profiling source location data).  That one file can't really be 
split into smaller pieces.  On the other hand, it is mostly static data, 
so I'd be surprised if gcc had difficulty with it.

> Does anyone have any suggestions on how to successfully compile this
> program?  Perhaps something similar to the -coalesce option (which
> appears to have been removed -- please let me know if that is not the
> case) which will produce multiple smaller source files in place of this
> one large file?

The option has been renamed to  '-chunkify coalesce<N>'.  You might also 
try '-chunkify func', which is akin to disabling chunking of multiple IL 
functions into a single C function.
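For example, a minimal sketch of invoking both options (the program name 
and coalesce size here are hypothetical; '<N>' is the per-chunk size 
limit you choose):

```shell
# Coalesce IL functions into chunks of at most ~4096 units each,
# producing multiple smaller .c files instead of one large one:
mlton -profile time -chunkify 'coalesce4096' bigprogram.mlb

# Or place each IL function in its own chunk (no coalescing at all):
mlton -profile time -chunkify func bigprogram.mlb
```

Smaller coalesce sizes yield more, smaller .c files, which gcc should 
handle with less memory at some cost in compile time and cross-chunk 
call overhead.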
