
Tested this latest version on my American extract with -Xmx4000m: with 1.2 million nodes the Java VM crashed due to lack of memory. With 1 million nodes the split succeeded, producing 367 areas in 3:20 hours, though some swapping was noticed (bad for speed). I'd rather use the 1.2 million setting, since that's a nice balance between the number of failed map builds and map size worldwide, but using 1 million now lets me do all the work myself instead of relying on others. Great, thanks a million!

Next up: splitting the whole world on my laptop and processing the output with mkgmap, then seeing which areas need fixing manually...

Chris Miller wrote:
I've built a new version that *might* be able to handle the planet OK. I don't know how many areas North America breaks into, but if you're able to handle 255 areas (at 1,600,000 nodes each) with an older version of the splitter, then I think this version should work for the whole planet:
http://redyeti.net/splitter.jar
If you still can't get it to work but can at least generate areas.list, then try reducing the number of nodes per area, since that directly reduces the amount of memory required when the split files are being written out.
Assuming no one finds any serious problems with this build, I'll get it checked in and released properly so everyone can benefit.
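For anyone wanting to reproduce the trade-off described above, the runs would look roughly like this. This is a sketch only: the --max-nodes and --split-file option names and the input file name are my assumptions about the splitter's command line, not something confirmed in this thread.

```shell
# Sketch, assuming the splitter accepts --max-nodes and --split-file.
# 1.2M nodes per area crashed with a 4 GB heap, so drop to 1M:
java -Xmx4000m -jar splitter.jar --max-nodes=1000000 north-america.osm

# If the areas are computed fine but writing the split files runs out
# of memory, reuse the generated areas.list with a lower node count:
java -Xmx4000m -jar splitter.jar --split-file=areas.list \
    --max-nodes=800000 north-america.osm
```

Lowering --max-nodes trades more (smaller) output areas for a lower peak memory footprint while the split files are written.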