Hi,
We've been having this issue for a while, and we think resolving it may be the key to fixing an intermittent server 500 error that we've also been seeing.
When a user tries to save a file while a batch process is running, BaseX saves duplicates of the file.
How to reproduce:
1) Take a fresh BaseX 9.0.1 installation.
2) Copy the attached .xqm in webapp.
3) Create an empty DB called mydb.
4) Access localhost:port-num/test/create-update-a-lot-of-files to populate your db.
5) In OxygenXML, set up a WebDAV connection to the db, open a file, and add a character in one of the elements, but don't save the file.
6) From the browser, access localhost:port-num/test/update-something.
7) While the process in the browser is still running, save the file in Oxygen. You'll get a message saying that the read timed out. Click OK and do not try saving the file again.
8) When the update-something process is done running, don't resave the file in Oxygen; instead go to localhost:port-num/test/oups-duplicates. You'll get a message saying that some files are duplicated. If you don't, try again from step 4 a few times. You'll only get duplicates if you get the time-out message while the update-something process is still running. If you try to save the file many times, you'll get more duplicates, 4 or 6.
We're not sure if it's a BaseX bug or if we are setting our user management and/or locking rules incorrectly.
Do you have any suggestions?
Hi,
Just wondering if this slipped through the cracks.
On Wed, May 2, 2018 at 1:11 PM, France Baril france.baril@architextus.com wrote:
--
France Baril
Architecte documentaire / Documentation architect
france.baril@architextus.com
Hi France,
I’ve just returned after a little break. Thanks for the detailed instructions; I’ll follow the described steps in the course of this week.
Best, Christian
On Sun, May 13, 2018 at 3:36 PM, France Baril france.baril@architextus.com wrote:
Dear France,
A first update:
I noticed that the oXygen file access while updating the database causes various exceptions (which are written to the BaseX logs). As a result, I also get duplicate files in the database. I will try to find out if this is something we can resolve, or if it goes back to the Milton WebDAV library we use.
A minor note: you can speed up the duplicates lookup by using group by:
  let $duplicates := (
    for $file-group in db:list('mydb')
    group by $path := string($file-group)
    let $count := count($file-group)
    where $count > 1
    return <li>There are { $count } instances of { $path }.</li>
  )
  return
    if ($duplicates)
    then <ul>{ $duplicates }</ul>
    else <p>All is good. No duplicate found.</p>
Apart from that, I noticed that it takes a very long time to list the 50,000 files in oXygen. Yet another issue that may be due to the restrictions of WebDAV, but I’ll see if something can be done in BaseX to speed this up.
Best, Christian
Perfect, happy we could finally find a way to get this issue replicated!
We only have one DB with more than 1000 files, so display speed is not much of an issue. We'd be happy to get speed anyway :-). The 50,000 files were just to make the bug easy to replicate.
On Wed, May 16, 2018 at 4:04 PM, Christian Grün christian.gruen@gmail.com wrote:
Hi France,
The delay for retrieving the file list seems to be oXygen-specific: BaseX itself requires approx. 1 second to create a list of the 50,000 files, but it takes around 180 seconds until the resources are displayed in the oXygen WebDAV explorer. I tried another WebDAV implementation: with the WebDAV plugin of the Windows application Total Commander, the files are listed after 3 seconds.
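As an aside, the server-side cost of the listing can be measured directly in XQuery, independent of any WebDAV client. A minimal sketch using BaseX’s profiling module (assuming the database name mydb from the reproduction steps):

```xquery
(: Evaluates the listing and writes its evaluation time to standard error;
   the query result itself is the number of resources in the database. :)
prof:time(count(db:list('mydb')))
```

This helps separate the time BaseX spends enumerating resources from the time the WebDAV client spends issuing requests and rendering the result.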
But back to your original question: My troubles started when I tried to open and close a file with oXygen (version 20): If I open a single resource, a NullPointerException is output by BaseX (on command line). If I close the file and try to reopen it, oXygen returns 500 (“Problem while trying to acquire lock”).
Do you experience a similar behavior? Which versions of BaseX and oXygen are you currently working with?
Unfortunately, the WebDAV protocol has been causing problems ever since we first implemented it. This is due on the one hand to the outdated library we use, and on the other to the protocol itself (each WebDAV client seems to use it differently). Maybe you could have a look at Axxepta’s Argon Author plugin for oXygen:
Best, Christian
On Thu, May 17, 2018 at 10:55 AM, France Baril france.baril@architextus.com wrote:
Hi France,
Some updates:
• I fixed the locking bug that caused a null pointer exception.
• As you probably know, the WebDAV locks were organized in an additional ~webdav database on disk. I decided to change this quite fundamentally: from now on, the locks will be kept in main memory. Locks will be lost if BaseX is restarted (but I expect this to happen rarely in production environments).
• The good news is that the oXygen WebDAV explorer will be much faster now! I noticed that 50,000 internal log checks were performed with oXygen. This didn’t happen with other WebDAV clients.
I’d be pleased if you could check out the latest snapshot [1] and give me an update if it works as expected. The actual problem you reported has not been fixed yet, but I’m positive that things are clearing up.
Best, Christian
[1] http://files.basex.org/releases/latest/
On Thu, May 17, 2018 at 5:42 PM, Christian Grün christian.gruen@gmail.com wrote:
Awesome, any progress is good. We've been dealing with that lock issue for the longest time, but it took a while to figure out how to replicate it systematically; it was always so random. I'll check with the client about when we can upgrade and will let you know how it goes.
On Thu, May 17, 2018 at 9:05 PM, Christian Grün christian.gruen@gmail.com wrote:
I believe we have fixed the reported issue. Moreover, we could no longer trigger any of the exceptions (EOF, etc.). The new snapshot is available.
Cheers, Christian
On Fri, May 18, 2018 at 2:08 PM, France Baril france.baril@architextus.com wrote:
Hi,
The first round of tests looks good. We'll try it in production and see how it responds with multiple users.
The only issue (a warning, not an error) I got was when trying to open a file while the server is busy with a process. The message is:
[qtp1635546341-17] WARN com.bradmcevoy.http.LockInfo - resource is being locked with a null user. This won't really be locked at all...
org.basex.query.QueryIOException: Cannot convert empty-sequence() to xs:string: $owner := ().
at org.basex.http.webdav.WebDAVQuery.execute(WebDAVQuery.java:122)
at org.basex.http.webdav.WebDAVLockService.lock(WebDAVLockService.java:74)
at org.basex.http.webdav.WebDAVResource$4.get(WebDAVResource.java:138)
at org.basex.http.webdav.WebDAVResource$4.get(WebDAVResource.java:1)
at org.basex.http.webdav.WebDAVCode.eval(WebDAVCode.java:37)
at org.basex.http.webdav.WebDAVCode.evalNoEx(WebDAVCode.java:53)
at org.basex.http.webdav.WebDAVResource.lock(WebDAVResource.java:153)
at com.bradmcevoy.http.webdav.LockHandler.processNewLock(LockHandler.java:225)
at com.bradmcevoy.http.webdav.LockHandler.processExistingResource(LockHandler.java:110)
at com.bradmcevoy.http.webdav.LockHandler.process(LockHandler.java:86)
at com.bradmcevoy.http.StandardFilter.process(StandardFilter.java:52)
at com.bradmcevoy.http.FilterChain.process(FilterChain.java:40)
at com.bradmcevoy.http.HttpManager.process(HttpManager.java:228)
at org.basex.http.webdav.WebDAVServlet.run(WebDAVServlet.java:34)
at org.basex.http.BaseXServlet.service(BaseXServlet.java:59)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:856)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:535)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1595)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1253)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1564)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1155)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.eclipse.jetty.server.Server.handle(Server.java:531)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:352)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:281)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:102)
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:319)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:175)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:133)
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:754)
at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:672)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.basex.query.QueryException: Cannot convert empty-sequence() to xs:string: $owner := ().
at org.basex.query.QueryError.get(QueryError.java:1392)
at org.basex.query.QueryError.typeError(QueryError.java:1575)
at org.basex.query.value.type.SeqType.promote(SeqType.java:381)
at org.basex.query.var.Var.checkType(Var.java:191)
at org.basex.query.expr.gflwor.Let.optimize(Let.java:94)
at org.basex.query.func.StaticFunc.inlineExpr(StaticFunc.java:286)
at org.basex.query.func.StaticFuncCall.compile(StaticFuncCall.java:68)
at org.basex.query.scope.MainModule.comp(MainModule.java:83)
at org.basex.query.QueryCompiler.compile(QueryCompiler.java:114)
at org.basex.query.QueryCompiler.compile(QueryCompiler.java:105)
at org.basex.query.QueryContext.compile(QueryContext.java:302)
at org.basex.query.QueryContext.iter(QueryContext.java:321)
at org.basex.query.QueryProcessor.iter(QueryProcessor.java:90)
at org.basex.query.QueryProcessor.value(QueryProcessor.java:99)
at org.basex.http.webdav.WebDAVQuery.execute(WebDAVQuery.java:116)
... 44 more
But no more duplicates and no more lock messages.
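For context, the warning and stack trace above boil down to a lock request arriving without an owner, so the lock query receives an empty sequence for `$owner`. A minimal sketch of the defensive check, in Python with hypothetical names (not BaseX code):

```python
class LockError(Exception):
    """Raised when a WebDAV lock request is invalid."""

def acquire_lock(locks, resource, owner):
    """Register a lock for `resource`, rejecting anonymous requests.

    `locks` is a plain dict mapping resource paths to owner names.
    """
    if owner is None:
        # Mirrors "Cannot convert empty-sequence() to xs:string: $owner := ()":
        # a missing owner cannot be turned into a lock entry.
        raise LockError("resource is being locked with a null user")
    if resource in locks:
        raise LockError(f"{resource} is already locked by {locks[resource]}")
    locks[resource] = owner
    return locks[resource]
```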
On Fri, May 18, 2018 at 2:15 PM, Christian Grün christian.gruen@gmail.com wrote:
I believe we have fixed the reported issue. Furthermore, we didn’t manage to trigger any of the exceptions (EOF, etc.) anymore. The new snapshot is available.
Cheers, Christian
On Fri, May 18, 2018 at 2:08 PM, France Baril france.baril@architextus.com wrote:
Awesome, any progress is good. We've been dealing with that lock issue for the longest time, but it took a while to figure out how to replicate it systematically. It was always so random. I'll see with the client when we can upgrade and will let you know how it goes.
On Thu, May 17, 2018 at 9:05 PM, Christian Grün christian.gruen@gmail.com wrote:
Hi France,
Some updates:
• I fixed the locking bug that caused a null pointer exception.
• As you probably know, the WebDAV locks were organized in an additional ~webdav database on disk. I decided to change this quite fundamentally: From now on, the locks will be kept in main-memory. Locks will get lost if BaseX is restarted (but I expect this to rarely happen in productive environments).
• The good news is that the oXygen WebDAV explorer will be much faster now! I noticed that 50.000 internal log checks were performed with oXygen. This didn’t happen with other WebDAV clients.
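The main-memory lock design described above (locks no longer stored in a ~webdav database, and lost on restart) can be pictured with a small sketch. This is purely illustrative, not the actual BaseX implementation: a process-local, thread-safe table that simply vanishes when the server process stops.

```python
import threading

class InMemoryLocks:
    """Process-local WebDAV-style lock table: entries do not survive a restart."""

    def __init__(self):
        self._mutex = threading.Lock()
        self._locks = {}  # path -> owner

    def lock(self, path, owner):
        """Try to lock `path` for `owner`; return False if already locked."""
        with self._mutex:
            if path in self._locks:
                return False
            self._locks[path] = owner
            return True

    def unlock(self, path):
        """Release the lock on `path` (no-op if it is not locked)."""
        with self._mutex:
            self._locks.pop(path, None)

    def owner(self, path):
        """Return the owner holding `path`, or None."""
        with self._mutex:
            return self._locks.get(path)
```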
I’d be pleased if you could check out the latest snapshot [1] and give me an update if it works as expected. The actual problem you reported has not been fixed yet, but I’m positive that things are clearing up.
Best, Christian
[1] http://files.basex.org/releases/latest/
On Thu, May 17, 2018 at 5:42 PM, Christian Grün christian.gruen@gmail.com wrote:
Hi France,
The delay for retrieving the file list seems to be oXygen-specific: BaseX itself requires approx. 1 second to create a list of the 50.000 files, but it takes around 180 seconds until the resources are displayed in the oXygen WebDAV explorer. I tried another WebDAV implementation: with the WebDAV plugin of the Windows application Total Commander, the files are listed after 3 seconds.
But back to your original question: My troubles started when I tried to open and close a file with oXygen (version 20): If I open a single resource, a NullPointerException is output by BaseX (on command line). If I close the file and try to reopen it, oXygen returns 500 (“Problem while trying to acquire lock”).
Do you experience a similar behavior? Which versions of BaseX and oXygen are you currently working with?
Unfortunately, the WebDAV protocol has been causing problems since the very beginning, when we first implemented it. This is due on the one hand to the outdated library we use, and on the other to the protocol itself (each WebDAV client seems to use it differently). Maybe you could have a look at Axxepta’s Argon Author plugin for oXygen:
Best, Christian
On Thu, May 17, 2018 at 10:55 AM, France Baril france.baril@architextus.com wrote:
Perfect, happy we could finally find a way to get this issue replicated!
We only have one DB with more than 1000 files, so display speed is not much of an issue. We'd be happy to get speed anyway :-). The 50,000 files were just to make the bug easy to replicate.
On Wed, May 16, 2018 at 4:04 PM, Christian Grün christian.gruen@gmail.com wrote:
Dear France,
A first update:
I noticed that the oXygen file access while updating the database causes various exceptions (which are written to the BaseX logs). As a result, I also get duplicate files in the database. I will try to find out if this is something we can resolve, or if it goes back to the Milton WebDAV library we use.
A minor info: You can speed up the duplicates lookup by using group by:
let $duplicates := (
  for $file-group in db:list('mydb')
  group by $path := string($file-group)
  let $count := count($file-group)
  where $count > 1
  return <li>There are { $count } instances of { $path }.</li>
)
return
  if ($duplicates)
  then <ul>{ $duplicates }</ul>
  else <p>All is good. No duplicate found.</p>
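The same group-by idea, expressed in Python for illustration (the file names in the test are made up): count each path once, then keep only the paths seen more than once.

```python
from collections import Counter

def find_duplicates(paths):
    """Return {path: count} for every path that is listed more than once."""
    counts = Counter(paths)  # single pass over the listing, like 'group by'
    return {path: n for path, n in counts.items() if n > 1}
```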
Apart from that, I noticed that it takes a very long time to list the 50.000 files in oXygen. Yet another issue that may be due to the restrictions of WebDAV; but I’ll see if something can be done in BaseX to get this accelerated.
Best, Christian
Hi France,
I have finalized the WebDAV rewritings. The warning you observed shouldn’t occur anymore (the same applies to the returned stack trace, which was a consequence of the default user not being assigned to the internal locks).
Have fun while testing, Christian
[1] http://files.basex.org/releases/latest/