Did you use debugging (-d)?
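If you are running the standalone CLI, debugging can be switched on at invocation time. A minimal sketch, assuming the standalone JAR and a query file named query.xq (both names illustrative):

  java -Xmx7g -cp basex.jar org.basex.BaseX -d query.xq

The -d flag enables debug output, and -Xmx7g raises the JVM heap to 7 GB.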
Erol Akarsu:

I had to index the XML file first, and then it processed fine. I think it is impossible to process such a large (9 GB) XML file directly like this.

Christian, it does not show any stack trace. It only shows an "Out Of Main Memory" exception on the "Query Info" page.
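For reference, "indexing the file first" here means creating a BaseX database from it, roughly as follows; the database name "products" and the file path are illustrative:

  CREATE DB products /data/big-products.xml

Once the file is in a database, queries read from the database storage instead of re-parsing the raw 9 GB document in memory.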
On Wed, Jun 18, 2014 at 8:02 PM, Christian Grün <christian.gruen@gmail.com> wrote:
What does the stack trace look like?
On 18.06.2014 16:09, "Erol Akarsu" <eakarsu@gmail.com> wrote:

But today it is an out-of-memory exception. I have a roughly 9 GB XML file that I would like to process with BaseX, but I am getting an out-of-memory error (even with -Xmx7G). Yesterday, I was able to process it like this, where $rawProductsFile is the name of the big XML file. The file consists of "Products" elements; the query processes them one by one and appends the results to files:
let $res :=
  for $doc in fn:doc($rawProductsFile)//Products
  let $list :=
    for $tname in $tables
    let $rows := ($doc[fn:name() eq $tname] | $doc//table[@name eq $tname])
    let $list2 := local:processOneProd($tname, $rows, $tableMap)
    let $fw := file:append(fn:concat($tablesFolder, $tname, ".xml"), $list2)
    return $fw
  return $list

Which way is a good way to process a large XML file like this, processing each element one by one?

I appreciate your help.

Erol Akarsu
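A sketch of the database-backed variant mentioned above ("I had to index the XML file first"): the loop stays the same, but reads from a pre-built database instead of parsing the raw file with fn:doc. The database name "products" is an assumption; $tables, $tableMap, and local:processOneProd are the ones from the query above.

  (: same loop, reading from a database created beforehand with CREATE DB;
     "products" is an illustrative database name :)
  let $res :=
    for $doc in db:open("products")//Products
    let $list :=
      for $tname in $tables
      let $rows := ($doc[fn:name() eq $tname] | $doc//table[@name eq $tname])
      return file:append(fn:concat($tablesFolder, $tname, ".xml"),
                         local:processOneProd($tname, $rows, $tableMap))
    return $list
  return $res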