All docs pages in a single XML file would probably be a great source for demonstrating full-text search. ;)
By the way, you can fetch all pages in export format (is there any reader for that other than a local MediaWiki instance?) -- using XQuery, of course:
```xquery
declare namespace xhtml = "http://www.w3.org/1999/xhtml";
declare namespace wiki  = "http://www.mediawiki.org/xml/export-0.4/";
declare option db:parser "html";

let $pages := fn:serialize(
  doc("http://docs.basex.org/wiki/Special:AllPages")//xhtml:table[@class="mw-...,
  map { "method" := "text", "item-separator" := "\r\n" }
)
let $request :=
  <http:request href="http://docs.basex.org/index.php?title=Special:Export" method="post">
    <http:body media-type="application/x-www-form-urlencoded" method="text">
      <![CDATA[catname=&curonly=1&wpDownload=1&pages=]]>{ fn:encode-for-uri($pages) }
    </http:body>
  </http:request>
return http:send-request($request)//wiki:mediawiki
```
That code parses the "All Pages" special page, extracts the list of page titles, and feeds that list into the "Export" page.
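To pick up the full-text idea from the top: once the returned wiki:mediawiki element is stored in a database, querying it is a one-liner. A minimal sketch, assuming the export was stored in a database called "docs" (both the database name and the search term are just examples):

```xquery
(: Sketch only: assumes the export fetched above was stored in a database
   named "docs"; the database name and the search term are placeholders. :)
declare namespace wiki = "http://www.mediawiki.org/xml/export-0.4/";

for $page in db:open("docs")//wiki:page
where $page//wiki:text contains text "full-text search"
return string($page/wiki:title)
```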
Regards from Lake Constance, Germany, Jens Erat
@Christian: We're definitely missing an HTTP POST example for form data in the docs; `<http:body method="text"/>` was really hard to find.
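For the docs, even a tiny self-contained sketch would probably help. Something along these lines, with a placeholder URL and made-up form fields:

```xquery
(: Minimal sketch of a form-data POST with the EXPath HTTP Client.
   The target URL and the form fields are placeholders. :)
declare namespace http = "http://expath.org/ns/http-client";

let $request :=
  <http:request href="http://example.com/submit" method="post">
    <http:body media-type="application/x-www-form-urlencoded" method="text">{
      "name=" || fn:encode-for-uri("BaseX") ||
      "&amp;comment=" || fn:encode-for-uri("Hello World")
    }</http:body>
  </http:request>
return http:send-request($request)
```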