I recognise your problem; I reported it myself but never followed up with more details. I was using the BaseX 7.5 beta client/server. My first database contained 2.7 million documents, but I created a new one from an exported subset of 700k documents, which lowered memory use directly after loading the DB.
Any chance you use the SQL module in your processing?
My guess was that it was a design choice to keep previously opened documents from an in-use database in memory. Running out of memory probably wasn't, though ;)
Ben
On 20 May 2013 04:32, Christopher.R.Ball christopher.r.ball@gmail.com wrote:
I have a BaseX script (.bxs) that runs queries in batches (sets of 5k documents). As it progresses, it slows down, does not release memory between batches even if I force it to close and reopen the database between queries, and eventually runs out of memory.
But if I break the same BaseX script into separate files, each running the exact same batches, it is extremely fast and memory efficient.
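For reference, a minimal sketch of the kind of batched command script being described (the database name, query, and batch boundaries here are hypothetical; the actual script isn't shown in this thread):

```
# batch.bxs -- hypothetical BaseX command script
# Each batch opens the database, queries a 5k-document slice, and closes again.
OPEN mydb
XQUERY (collection('mydb')/*)[position() = 1 to 5000]
CLOSE
OPEN mydb
XQUERY (collection('mydb')/*)[position() = 5001 to 10000]
CLOSE
```

The reported behavior is that running all batches in one .bxs file like this exhausts memory, while splitting each OPEN/XQUERY/CLOSE group into its own file (one JVM run per batch) stays fast, which is why it looks like a leak rather than normal heap growth.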
Very suggestive of a memory leak . . .
I am running on BaseX 7.6.1 Beta.
Any thoughts?
Is there a way to force the script to do garbage collection?
BaseX-Talk mailing list BaseX-Talk@mailman.uni-konstanz.de https://mailman.uni-konstanz.de/mailman/listinfo/basex-talk