Hi There,
I have created a script which sets up a job via the jobs:eval() function of BaseX to perform multiple actions on the inputs before ingesting them into the database.
I have to report each success/failure step of every job. For this, I create a per-job XML file within the database under the '/job/' directory, which is an update operation.
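A simplified sketch of what I mean (the database name "ingest", the job id and the ingest steps are placeholders, not the real script):

  (: queue the long-running work as a job and write a per-job report document :)
  let $id := 'job-' || string(current-dateTime())
  let $query := '
    declare variable $id external;
    (: initial status report under the job/ directory of the database :)
    db:replace("ingest", "job/" || $id || ".xml",
      <job id="{$id}" started="{current-dateTime()}" status="running"/>)
    (: ...the actual ingest steps and further report updates follow here... :)
  '
  return jobs:eval($query, map { 'id': $id })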
Everything is working fine, except that users are unable to perform any other action on the database while a job is running. It seems our running job is blocking the whole database.
A job may take anywhere from 20 minutes to 2 hours to finish.
Is it the correct behavior of BaseX to block the whole database (even for reading), or am I doing something wrong? Please advise.
Regards, Navin
Hi Navin,
Is it the correct behavior of BaseX to block the whole database (even for reading), or am I doing something wrong? Please advise.
This depends on your queries (see [1]). You can use jobs:list-details() to check if your jobs lead to local or global locks.
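For example, a quick, untested sketch (assuming the lock information ends up in the reads/writes attributes of the returned job elements, with "(global)" indicating a global lock):

  (: list all registered jobs together with the locks they claim :)
  for $job in jobs:list-details()
  return $job/@id || ': state=' || $job/@state
    || ', reads=' || $job/@reads || ', writes=' || $job/@writes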
Best, Christian

[1] http://docs.basex.org/wiki/Transaction_Management
Hi Christian,
It is a global lock. How can I prevent it? Should I avoid accessing the database while jobs are running? Would that work, or is there something else I can do?
Regards, Navin
Hi Navin,
It is a global lock. How can I prevent it? Should I avoid accessing the database while jobs are running? Would that work, or is there something else I can do?
I invite you to check out the link that I referenced in my last mail; this is described in the documentation.
Christian
Hi,
As I'm working around the same sort of problem, here is what I do:
* I check every xquery:eval and xquery:update and turn it into a jobs:eval, because they inevitably cause a global read or write lock, respectively (a stripped-down sketch of this follows after the list).
* I put every db:* call (e.g. db:list()) in the code into its own short-running job.
* I replace every doc($var) and collection($var) with a doc("stringconst") or collection("stringconst") (for example, by using the bxcode snippet for binding external database-name variables mentioned earlier on this list).
* I split my large database, consisting of about 700 XML documents with 2.5 GB of data, into 700 databases with about 5 MB of data each.
* I stopped using the DBA module's Queries interface, because it implicitly uses xquery:eval or xquery:update and so always leads to a global read or write lock.
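A stripped-down sketch of the pattern ("mydb" and the document path are only examples, not my real code):

  (: read with a static database name instead of doc($var),
     so BaseX can compute a local read lock for "mydb" only :)
  doc("mydb/doc.xml"),

  (: update queued as its own short job instead of xquery:update,
     so the enclosing query does not need a global write lock :)
  jobs:eval(
    'declare variable $v external;
     db:replace("mydb", "doc.xml", <doc>{$v}</doc>)',
    map { 'v': 'new content' })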
This opens up a lot of opportunities for read queries to see a somewhat inconsistent state, but I'm willing to accept that.
So now I am down to a few seconds or less of locking for any particular database. I hope this will enable me to do updates without blocking read queries for longer periods of time. I'm currently working on a REST-based test for this.
Best regards
Omar Siam
Hi Omar,
Are you able to avoid global locking?
However, I don't think that creating multiple databases just to avoid locking is the right option.
In my opinion, BaseX should improve its locking functionality. Instead of blocking the whole database, it should lock only the particular URI that is being updated; that would resolve the problem effectively.
Anyway, thanks for sharing your working strategy.
Regards, Navin
In my opinion, BaseX should improve its locking functionality. Instead of blocking the whole database, it should lock only the particular URI that is being updated; that would resolve the problem effectively.
That would be nice indeed. It would surely be easier to realize if the language features of XQuery were more restricted (similar to SQL). One long-term solution would be the switch to MVCC.