Hello Hans-Juergen,
here are some details about my use case, which is similar to yours.
I'm using BaseX to insert the live public Twitter stream into databases (see wiki entry [1]).
One Twitter message is around 4 KB in size, and I'm able to insert about 2,000 of them per second
using single XQuery Update inserts. So that would probably work out for you, too.
If you use bulk inserts, e.g. caching the incoming items in a list and running one XQuery Update for all of them, the insert rate would increase even further.
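A minimal sketch of such a bulk insert (the database name 'tweets', its root element <messages/>, and the external variable $batch are assumed names for illustration, not from this thread):

```xquery
(: Sketch: insert a cached batch of messages in a single updating query.
   'tweets' and $batch are hypothetical names. :)
declare variable $batch external;
insert nodes $batch into db:open('tweets')/messages
```

Running one updating query per batch amortizes the per-transaction overhead across all items in the batch.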
Making the messages available for querying could be a bigger problem: as long as you are writing items into the database (which in your use case will never stop), readers are blocked.
And while one of your readers is running, writers are blocked.
Hope this helps,
Andreas
On 03.07.2012 at 15:30, Hans-Juergen Rennau wrote:
Hello,
this is a general question as to whether in a given scenario BaseX might be an appropriate instrument.
Approximately 1000 XML log messages per second must be stored and thus made available for querying. The messages are expected to be <= 1 MB each.
The log messages may be sent by any number of clients simultaneously. The clients are probably not able to specify unique document URIs, as they are working independently of each other. So the database would have to create unique URIs - perhaps concatenating a semantic part supplied by the client and a generated unique identifier - as part of the storage processing.
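Such URI construction might be sketched like this (hypothetical names throughout; this assumes the db:add() function of BaseX's Database Module and random:uuid() from its Random Module are available):

```xquery
(: Sketch: store a message under a unique URI built from a
   client-supplied semantic prefix and a generated UUID.
   'logs', $prefix and $message are assumed names. :)
declare variable $prefix  external;  (: semantic part supplied by the client :)
declare variable $message external;  (: the incoming XML log message :)
db:add('logs', $message, $prefix || '-' || random:uuid() || '.xml')
```

Because the database generates the identifier, independent clients can send messages concurrently without coordinating on unique document URIs.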
Would this be possible already now? If not, perhaps in the near future?
Thank you,
kind regards,
Hans-Juergen
_______________________________________________
BaseX-Talk mailing list
BaseX-Talk@mailman.uni-konstanz.de
https://mailman.uni-konstanz.de/mailman/listinfo/basex-talk