I have written an XQuery that properly extracts binary (raw) archive resource contents to binary (raw) resources. However, since archive:extract-binary returns the contents as xs:base64Binary rather than as streamable items, large archive contents are loaded completely into memory, which will often cause an OutOfMemoryError.
Right; see [1].
This is not acceptable in a system where many users may be doing this concurrently.
Well, I only partially agree. As in Java and other full-fledged languages, it’s fairly easy to write code that causes out-of-memory errors, and – as far as I know – it’s generally hard to restrict memory usage to a single thread within the same JVM. We try to do so with xquery:eval [2], but there is no guarantee that it will always work out.
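For instance, the extraction could be run in a separate evaluation with memory and time caps attached. This is only a minimal sketch based on the options described in [2]; the database name, archive path and the concrete limits are arbitrary examples:

(: run the extraction in its own evaluation, capped (for illustration) at 512 MB and 60 seconds :)
let $query := "
  let $archive := db:retrieve('db', 'myfolder/myarchive.zip')
  return archive:extract-binary($archive)
"
return xquery:eval($query, map { }, map { 'memory': 512, 'timeout': 60 })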
Christian
[1] https://www.mail-archive.com/basex-talk%40mailman.uni-konstanz.de/msg07078.h...
[2] http://docs.basex.org/wiki/XQuery_Module#xquery:eval
archive:extract-to seems to only extract to an external file-system location and not to binary (raw) resources. Example code is provided below that extracts to the same database folder as the archive; however, the target location of the contents is not pertinent to the problem.
let $database := 'db'
let $archivePath := 'myfolder/myarchive.zip'
let $archive := db:retrieve($database, $archivePath)
(: $basePath already ends with a slash (or is empty for a root-level archive) :)
let $basePath := replace($archivePath, '[^/]+$', '')
let $entries := archive:entries($archive)
let $contents := archive:extract-binary($archive)
for $entry at $pos in $entries
let $content := $contents[$pos]
let $target := $basePath || $entry/text()
return db:store($database, $target, $content)
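For comparison, a minimal sketch of the archive:extract-to variant mentioned above, which writes the entries directly to a file-system directory instead of materializing them as items; the target directory is a hypothetical example path:

let $database := 'db'
let $archivePath := 'myfolder/myarchive.zip'
let $archive := db:retrieve($database, $archivePath)
(: extract all entries to a file-system directory (example path) :)
return archive:extract-to('/tmp/extracted/', $archive)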