Hi Alexander,
what does your XQuery/BaseX script look like? If you use the XQuery
doc() function, you could try to replace it with
parse-xml(fetch:text(...)): the latter approach will close your
documents and free memory as soon as a processed document is no
longer required.
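
A rough sketch of the difference (the $urls sequence and the
local:process() function are just placeholders for your own code):

  (: doc() keeps each fetched document open in the query's document
     pool, so all of them stay in main memory until the query ends :)
  for $url in $urls
  return local:process(doc($url))

  (: parse-xml(fetch:text(...)) builds a transient main-memory
     fragment that can be garbage-collected once it has been used :)
  for $url in $urls
  return local:process(parse-xml(fetch:text($url)))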
Best,
Christian
___________________________
2013/6/24 Alexander von Bernuth <alexander.von-bernuth@student.uni-tuebingen.de>:
Hello all,
my BaseX script is supposed to automatically fetch some 10,000 XML files from a
website and insert their content into an external PostgreSQL database. After
about 8,000 files the script stops with an "Out of Main Memory" error.
I found your discussion with "kgfhjjgrn" [1] regarding this issue, but I'm
not sure whether those options apply to my problem - I am not building a
BaseX database but writing to an external one. Will AUTOFLUSH=false and
flushing manually help here?
Second, I want to insert xs:base64Binary values into my PostgreSQL database,
but I cannot find the correct sql:parameter type for the bytea column.
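This is roughly what I am trying (connection string, table and value are
placeholders; type="string" is only a stand-in, because finding the right
type for the bytea column is exactly my question):

  let $conn := sql:connect("jdbc:postgresql://localhost/mydb", "user", "pass")
  let $prep := sql:prepare($conn,
    "INSERT INTO attachments (data) VALUES (?)")
  return sql:execute-prepared($prep,
    <sql:parameters>
      <sql:parameter type="string">{ "SGVsbG8=" }</sql:parameter>
    </sql:parameters>
  )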
Could you please help me with my issues?
Thank you very much,
Alexander
[1] http://comments.gmane.org/gmane.text.xml.basex.talk/2540
--
| Alexander von Bernuth
| alexander.von-bernuth@student.uni-tuebingen.de
_______________________________________________
BaseX-Talk mailing list
BaseX-Talk@mailman.uni-konstanz.de
https://mailman.uni-konstanz.de/mailman/listinfo/basex-talk