Hello List, I want to do a join on some large (300-400 MB) XML files and would appreciate guidance on the best strategy. At present these files are on the filesystem, not in a database.
Is there any equivalent to Zorba's streaming xml:parse()?
Would loading the files directly into a database be the right approach, or is it better to split them into smaller files first?
Is the file: module a suitable route for importing the files?
Thanks for your help
Peter
Dear Peter,
Did you try creating a collection with the files (via the CREATE command)? You should start that way; I don't see the point in using the file: module for import. Once the documents are in the database, file size should not matter (until you reach millions of files in the collection and perform a lot of document-related operations: list, etc.).
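
As a rough sketch (untested; the database names, paths, and element/attribute names below are just placeholders for your data), you could build one database per file and then join them in XQuery:

  CREATE DB orders /data/orders.xml
  CREATE DB parts  /data/parts.xml

  for $o in db:open("orders")//order
  for $p in db:open("parts")//part[@id = $o/@part-ref]
  return <match order="{ $o/@id }" part="{ $p/@id }"/>

As far as I know, BaseX can often rewrite such equality predicates to use its value indexes, so the join does not have to scan both documents from start to end for every comparison.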
-----Original Message----- From: basex-talk-bounces@mailman.uni-konstanz.de [mailto:basex-talk-bounces@mailman.uni-konstanz.de] On behalf of pw@themail.co.uk Sent: Monday, 11 February 2013 15:33 To: BaseX-Talk@mailman.uni-konstanz.de Subject: [basex-talk] handling large files: is there a streaming solution?