Hi Daiane,
Does your server really have 988 GB of RAM?! That would be quite impressive, and definitely not commodity hardware... Are you sure about that?
How large is your actual data set (number of files and aggregated size of all documents)? If you really have an enormous amount of data with lots of attributes and elements, it may simply not work with BaseX at the moment; we would need to take a deeper look at your files.
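For example, assuming a Linux server and that all documents sit in one directory (the /data/xml path below is only a placeholder for your actual location), the following commands would give us both numbers:

  # number of XML files in the collection
  find /data/xml -type f -name '*.xml' | wc -l
  # aggregated size of all documents
  du -sh /data/xml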
Cheers, Dirk
On 31/01/14 12:53, DAIANE ROBERTA CANDIDA wrote:
Hello Dirk,
You were right about the command used to increase Java's heap size, but loading my files still does not work.
The server has 988 GB of memory. When I set the -Xmx option to 500g, it raised the error below:
Error occurred during initialization of VM
Unable to allocate bit map for parallel garbage collection for the requested heap size.
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
So I decreased it from 500 to 350, and then the following error appeared:
Out of Main Memory. You can try to:
- increase Java's heap size with the flag -Xmx<size>
- deactivate the text and attribute indexes.
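A note on the first suggestion: the value only takes effect if it actually reaches the JVM that runs BaseX. A minimal sketch, assuming the Linux start scripts from the ZIP distribution (whether the BASEX_JVM variable is honored depends on the BaseX version; otherwise the -Xmx value in bin/basexserver can be edited directly, and the jar path below is only an example):

  # pass the heap limit through the start script
  export BASEX_JVM="-Xmx350g"
  bin/basexserver
  # or start the server directly with an explicit heap limit
  java -Xmx350g -cp BaseX.jar org.basex.BaseXServer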
I have already deactivated the indexes:
FTINDEX: OFF
TEXTINDEX: OFF
ATTRINDEX: OFF
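For reference, the corresponding console commands look roughly like this (mydb and /data/xml are only placeholders for the real database name and input path):

  SET FTINDEX false
  SET TEXTINDEX false
  SET ATTRINDEX false
  CREATE DB mydb /data/xml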
What should I do to import the data into BaseX?
Regards,
Daiane.