I’m continuing to test my process for loading data, which depends on optimizing databases, some of which are pretty large (100+ MB, with hundreds of thousands of elements). I’m testing on both macOS and Linux, using the 27 Feb build on macOS and the build from the 29th on Linux (just a matter of when I downloaded them).

What I’m seeing is that when I test with a relatively small content set, the optimization completes reliably and everything works as it should.

When I test with a realistically large data set, the optimization either takes a very long time (as much as an hour) or never completes, with the server pegged at 100% CPU utilization. It seems to be worse on macOS, but that’s difficult to verify, partly because a single test takes several hours to run.

I have the BaseX source code available locally, although I’m unable to compile it with Maven due to internal Maven issues (we have a pretty locked-down Maven proxy, and I don’t know Maven well enough to know how I might configure around that).
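
My guess is that I need to point Maven at our internal mirror via a <mirror> entry in ~/.m2/settings.xml, something like the sketch below (the URL is just a placeholder, not our actual proxy):

    <settings>
      <mirrors>
        <mirror>
          <id>internal-proxy</id>
          <mirrorOf>*</mirrorOf>
          <!-- placeholder URL; would be our actual internal proxy host -->
          <url>https://repo.example.com/maven2</url>
        </mirror>
      </mirrors>
    </settings>

but so far I haven’t gotten that working.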

Is there anything I can do to diagnose this issue, at least to confirm or rule out remaining deadlock issues in the optimization?
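
My current plan, assuming it’s a reasonable approach, is to take a thread dump of the server JVM while it’s stuck and look for reported deadlocks:

    jps -l          # find the PID of the org.basex.BaseXServer process
    jstack -l <PID> # dump all threads; any Java-level deadlock is reported at the end

If jstack prints “Found one Java-level deadlock”, I’d take that as confirmation. Is that the right way to check, or is there a better option?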

I assume that it should not take tens of minutes to optimize even a large database.

Here are the details for a typical database that has failed to optimize on macOS:

Thanks,

Eliot

_____________________________________________

Eliot Kimber

Sr Staff Content Engineer

O: 512 554 9368

M: 512 554 9368

servicenow.com
