Hi guys,
I'm running BaseX as a server. Every now and then it has to process huge pieces of data, but most of the time it's idle. The "pieces of data" are huge XML files that are ADDed to a new database and then read. Basically, I'm using BaseX as an intermediate random-access parser/indexer for some huge XML files.
Question 1:
The XML files are such that I need to specify -Xmx2048M; otherwise I get an out-of-memory error when ADDing the files. However, I notice that adding the same XML files in the BaseX GUI reports a memory usage of <300M. Is there some special option the GUI is using that I could also use on the server, so that memory usage is not so severe?
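One thing I have been experimenting with, in case it is related: BaseX has an ADDCACHE option that caches the input on disk before adding it, instead of holding it in main memory. A minimal sketch of what I mean (credentials and paths below are placeholders, not my real setup):

```shell
# Sketch: enable ADDCACHE so a large input file is cached on disk
# before being added, rather than buffered in the JVM heap.
# User, password, database name, and file path are placeholders.
BaseX/bin/basexclient -U admin -P admin <<'EOF'
SET ADDCACHE true
CREATE DB tempdb
ADD /path/to/huge.xml
EOF
```

I am not sure whether this is what the GUI does internally, though, so please correct me if this is the wrong track.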
Question 2:
After the data is extracted, it's no longer needed, so I DROP the database and close the connection. But the memory (the huge 2G mentioned above) is never returned to the system.
The script I use to run BaseX is:
export BASEX_JVM="-Xmx2048m -XX:MinHeapFreeRatio=10 -XX:MaxHeapFreeRatio=20 \
  -XX:+UseSerialGC \
  -Dorg.basex.LOG=false \
  -Dorg.basex.DBPATH=/var/basex/data \
  -Dorg.basex.REPOPATH=/var/basex/repo"
BaseX/bin/basexserver -S
So basically I tried specifying MaxHeapFreeRatio and the serial GC for Java, but it brings no improvement, so I assume the memory isn't being hogged by Java itself. Is there a way to free up the memory once operations complete? (As mentioned above, "complete" means the created DB is dropped, the connection is closed, and the server is waiting for another batch to start over.)
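For completeness, the workaround I am considering if nothing else helps: as far as I understand, the HotSpot JVM rarely returns committed heap to the OS even after the objects are collected, regardless of the heap-ratio flags. Since my batches are infrequent, I could run each batch in a short-lived standalone BaseX process instead of the long-running server, so the 2G is released when the process exits. A rough sketch (run-batch.bxs is a hypothetical command script that creates the DB, runs the queries, and drops the DB again):

```shell
# Sketch: run each batch in a fresh JVM via the standalone basex
# client, so all heap is returned to the OS when the process exits.
# run-batch.bxs is a placeholder for the actual batch commands.
export BASEX_JVM="-Xmx2048m"
BaseX/bin/basex run-batch.bxs
```

The downside is losing the always-on server, so I would still prefer a way to reclaim the memory in-process if one exists.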
Thanks, Dinu