Hello there,
I am trying to parse a large CSV file into XML with the following XQuery:
let $file := fetch:text("D:\BPLAN\tlk.txt")
let $convert := csv:parse($file, map { 'header': true(), 'separator': 'tab' })
return fn:put(
  <tlks>{
    for $row in $convert/csv/record
    return <tlk>{ $row/* }</tlk>
  }</tlks>,
  "D:\BPLAN\tlk.xml"
)
Using the GUI, it runs out of memory. When I click the bottom right-hand corner (where the memory usage is shown), it tells me to increase the memory by restarting with -Xmx <size>.
I tried this through the MS-DOS prompt, but -Xmx does not appear to be a recognized parameter any more.
Is there a better method for parsing large CSV files? I then want to add the resulting file tlk.xml to a new database.
Kindest regards,
Shaun Connelly-Flynn
Hi Shaun,
> I do this through the MS DOS prompt, but -Xmx does not appear to be a parameter any more.
If you work with the ZIP distribution of BaseX, you can adjust the memory setting in the start scripts (in the bin directory). Otherwise, you’ll need to pass it to java, not to BaseX itself: -Xmx is a JVM option, so it must appear before the main class or jar on the command line.
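For example, a launch command along these lines should work (a sketch: the jar name, path, and heap size are placeholders you’d adapt to your installation):

    # Start the BaseX GUI with a 4 GB JVM heap.
    # Adjust the path to basex.jar and the heap size to your setup.
    java -Xmx4g -cp basex.jar org.basex.BaseXGUI

The same -Xmx flag works for the other entry points (e.g. org.basex.BaseX for the command line).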
> Is there a better method for parsing large CSV files? I then want to add the resulting file tlk.xml to a new database.
Did you check if it’s the CSV parsing or the fn:put call that causes the error?

Christian
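One way to narrow it down is to run the parsing step on its own, without serializing anything (a sketch, using the same path and options as in your original query):

    let $file := fetch:text("D:\BPLAN\tlk.txt")
    return count(
      csv:parse($file, map { 'header': true(), 'separator': 'tab' })/csv/record
    )

If this already exhausts the memory, the parsed document itself is too large for the current heap; if it succeeds, the bottleneck is more likely the construction and fn:put step.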
basex-talk@mailman.uni-konstanz.de