Won't using a function like db:add amount to reprocessing that piece of XML, e.g. gathering statistics, rebuilding the database indexes, and so on?
But we have already processed that piece of XML during the update on the other copy of the database.

Maybe this example will make clearer what I mean.
Suppose we have a huge XML document, and creating the BaseX database for it takes a lot of time.
We would like to split that document, pass the parts to multiple BaseX instances, and finally merge the results, hoping to increase performance.
But if that merge takes place by adding the changes in one database to the other with db:add, won't that be like reprocessing those changes?
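
To make the question concrete, this is roughly the merge step I have in mind (an untested sketch; 'part1', 'part2' and 'merged' are made-up database names):

  (: copy every document from each part database into the target,
     keeping its original path via db:path :)
  for $db in ('part1', 'part2')
  for $doc in db:open($db)
  return db:add('merged', $doc, db:path($doc))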

Best,
Marios

On Thu, Mar 12, 2015 at 3:48 PM, Christian Grün <christian.gruen@gmail.com> wrote:
>> One more approach is to store new documents in a separate database, which
>> can then be merged with the main database.
> That is the key then. Is that possible?

Absolutely ;) It can, for example, be realized with the functions of
the database module [1] (though this requires some experience with XQuery).
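
Assuming new documents are collected in a separate database called
'updates' and your main database is called 'main' (both names are
placeholders), the merge could look roughly like this (untested):

  (: move every document from 'updates' into 'main',
     then remove it from 'updates' :)
  for $doc in db:open('updates')
  return (
    db:add('main', $doc, db:path($doc)),
    db:delete('updates', db:path($doc))
  )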

Best,
Christian

[1] http://docs.basex.org/wiki/Database_Module


>> Hi Marios,
>>
>> Thanks for your response.
>>
>> > So the problem I think comes down to how to merge two databases into one
>> > without having to reprocess the whole content.
>>
>> One more challenge, I think, will be finding out which documents
>> have changed on a particular server.
>>
>> Maybe it's easier to have one dedicated server that receives new
>> documents, which can then be distributed to the remaining servers. One
>> more approach is to store new documents in a separate database, which
>> can then be merged with the main database. As databases are very
>> light-weight constructs in BaseX, you can, for example, address more
>> than one database from a single XQuery expression [1].
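>>
>> For instance ('db1' and 'db2' are placeholder names; just a sketch):
>>
>>   (: count all documents across two databases in one expression :)
>>   count(db:open('db1')) + count(db:open('db2'))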
>>
>> Hope this helps,
>> Christian
>>
>> [1] http://docs.basex.org/wiki/Databases