Hi,
Is there a way to bulk export the BaseX docs wiki? Or, if it is backed by a DB, get a dump of it?
I'm travelling offline and would love to have access (the built-in PDF creator seems nice, but not for getting *everything*).
Many thanks, Colin
You may want to have a look at the PDF versions of our Wiki:
http://docs.basex.org/wiki/Documentation
If anyone out there has ideas on how to make PDF generation easier, we’d be glad for your feedback!
Thanks!
I see that now on the homepage. Can I suggest that this link be repeated under "print/export" in the left-hand menu?
And just out of curiosity, is there a way to get it as data (markup, CSV, etc.)?
> I see that now on the homepage. Can I suggest that this link be repeated under "print/export" in the left-hand menu?
Ok, done.
> And just out of curiosity, is there a way to get it as data (markup, CSV, etc.)?
If you find a way, I’d be interested as well ;) The MediaWiki homepage might be a good starting point [1].
Christian
It looks like recent versions of MediaWiki have an API enabled by default - if you have a login, you can use it to query the docs, etc.
I've just applied for one for docs.basex - if I find anything really cool I'll post back.
Thanks again, Colin
...thanks for investigating!
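For anyone who wants to go down that route, here is a minimal sketch of querying the MediaWiki web API from XQuery. Note that the api.php path on docs.basex.org and anonymous read access are assumptions, not something confirmed in this thread:

(: List all page titles via the MediaWiki web API (XML output). :)
let $api := concat(
  "http://docs.basex.org/api.php",
  "?action=query&amp;list=allpages&amp;aplimit=500&amp;format=xml"
)
return data(doc($api)//allpages/p/@title)

The API answers with plain XML (an api/query/allpages element containing one p element per page), so the result can be processed with ordinary path expressions.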
All docs pages in a single XML file would probably be a great source for demonstrating full-text search. ;)
By the way, you can fetch all pages in export format (is there any reader for that other than a local MediaWiki instance?) -- by using XQuery, of course:
declare namespace xhtml = "http://www.w3.org/1999/xhtml";
declare namespace wiki = "http://www.mediawiki.org/xml/export-0.4/";
declare option db:parser "html";

let $pages := fn:serialize(
  doc("http://docs.basex.org/wiki/Special:AllPages")
    //xhtml:table[@class="mw-allpages-table-chunk"]//xhtml:a,
  map { "method" := "text", "item-separator" := "\r\n" }
)
let $request :=
  <http:request href="http://docs.basex.org/index.php?title=Special:Export" method="post">
    <http:body media-type="application/x-www-form-urlencoded" method="text"><![CDATA[catname=&curonly=1&wpDownload=1&pages=]]>{ fn:encode-for-uri($pages) }</http:body>
  </http:request>
return http:send-request($request)//wiki:mediawiki
That code interprets the "All Pages" special page and feeds that list into the "Export" page.
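For a quick look at the resulting export XML without a local MediaWiki instance, a small follow-up sketch; it assumes $dump is bound to the wiki:mediawiki element returned above and that the export uses the usual page/title/revision/text structure of the export-0.4 schema:

declare namespace wiki = "http://www.mediawiki.org/xml/export-0.4/";

(: $dump: the wiki:mediawiki element fetched by the export query :)
declare variable $dump as element(wiki:mediawiki) external;

for $page in $dump/wiki:page
return <page title="{ $page/wiki:title }">{ string($page/wiki:revision/wiki:text) }</page>

With curonly=1 each page comes with a single revision, so each page yields one small <page> element holding its wiki text.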
Regards from Lake Constance, Germany, Jens Erat
@Christian: We're definitely missing an HTTP POST form-data example in the docs; `<http:body method="text"/>` was really hard to find.
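Since that `<http:body method="text"/>` detail is easy to miss, here is a stripped-down form-post sketch along the same lines; the target URL and field names are made up for illustration:

declare namespace http = "http://expath.org/ns/http-client";

(: Build a POST request whose body is a plain URL-encoded string. :)
let $request :=
  <http:request method="post" href="http://example.com/submit">
    <http:body media-type="application/x-www-form-urlencoded"
      method="text">name={ encode-for-uri("Jane Doe") }&amp;lang=en</http:body>
  </http:request>
return http:send-request($request)

http:send-request returns the http:response element followed by the parsed body, so the server's reply can be inspected directly in the same query.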
Yeah! I also fought half a day to make it work, switching back and forth between BaseX and EXPath! :-D M.