Dear all,


I’m working on a collection containing tens of millions of documents, updated weekly.

My first idea for storing these documents was simply to db:add/db:replace/db:delete them one by one, roughly as in the sketch below.
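
For illustration, the weekly load looks roughly like this (the database name 'patents', the element name 'doc' and the @id-based paths are simplified placeholders, not my real names):

  declare variable $new-docs external;  (: this week's new/updated documents :)

  for $doc in $new-docs
  return db:replace('patents', concat($doc/@id, '.xml'), $doc)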

This approach slows down as the number of documents increases.

For example, opening a single collection can take up to 4 minutes.


I believe the document list is an in-memory structure.


Is there a way to speed things up, or do I have to switch to the following approach in order to reduce the size of the ‘physical’ document list:


1. Group documents into ‘logical’ documents on insertion (fewer database documents, each gathering the new or updated documents under a single root XML element); see the sketch after this list.

2. Remove the old versions of these documents from the previous ‘logical’ documents with XQuery Update.
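
To make this concrete, here is a rough sketch of what I have in mind (again, the database name 'patents', the batch/doc element names, the @id key and the two external variables are only placeholders):

  declare variable $new-docs   external;  (: this week's new/updated documents :)
  declare variable $batch-path external;  (: e.g. 'batches/week-21.xml' :)

  let $ids := $new-docs/@id/string()
  return (
    (: 2. delete the superseded versions held in earlier 'logical' documents :)
    for $old in db:open('patents')//doc[@id = $ids]
    return delete node $old,

    (: 1. add the whole weekly batch as a single document under one root :)
    db:add('patents', element batch { $new-docs }, $batch-path)
  )

If I understand the pending update list correctly, both operations would be applied atomically at the end of the query, so the deletes would only see the batches from previous weeks.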


Has anybody already run into this problem and found a workaround?


BaseX is just fantastic!


Best regards,

Fabrice Etanchaud

Questel/Orbit