Dear all, thank you very much for sharing your knowledge and experiences! They are very valuable to me! I understand your observations on the qualitative aspects of the different XML models for JSON representation. For me and my team, it is in general always worth sticking to a standard when one is available. Even though I understand that the one I mentioned is not really formal as of now, given the very small amount of JSON we have to deal with in this specific use case, we will probably continue to follow it. Dirk's observation on making things smoother by allowing the "XSLT 3 standard" to be declared as a possible serialization format would, in our opinion, be useful anyway. Thanks a lot again for your always precious support! Regards (and nice weekend). Marco.
On 16/10/2015 12:21, Dirk Kirsten wrote:
Hello Marco,
I am also not sure about the rationale of your question. One of the reasons why we have multiple JSON representations/converters is that at the time of introducing this there wasn't any XQuery 3.1 spec yet, so obviously we couldn't implement it. There was stuff like JsonML around, which we do support, and which I personally feel the same about as the new xml-to-json specification in the XQuery spec: it's quite cumbersome and impractical to work with.
So we introduced our default mapping simply because we feel it is a good fit. And speaking from experience developing a rather large JavaScript (AngularJS) application against a RESTXQ backend serving XML, this mapping feels rather natural and is nice to work with. I wouldn't have wanted to do the same thing with the JsonML or the XQuery spec mapping.
Of course, you might have a different use case, and there are many legitimate reasons (or just personal preferences, which is OK as well) to choose a different representation; that's why we have multiple options.
Regarding your wish for cleaner service chaining, I just want to mention that there is a json-to-xml function as well (so no need to use json:serialize), so the round trip should be rather logical. Maybe it would be possible to add the XQuery spec mapping to our serialization options as well (given that it is already implemented in the functions themselves, it should at least be theoretically possible to add it there, too), so handling would be smoother.
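For illustration, the round trip via the standard XPath 3.1 functions could look like this (a sketch; the element names of the standard representation live in the http://www.w3.org/2005/xpath-functions namespace):

```xquery
(: Parse a JSON string into the XSLT 3.0 / XPath 3.1 XML representation,
   then serialize it straight back, with no map/array step in between. :)
let $json := '{"a": "12345678"}'
let $xml  := fn:json-to-xml($json)
(: $xml wraps <map xmlns="http://www.w3.org/2005/xpath-functions">
                <string key="a">12345678</string>
              </map> :)
return fn:xml-to-json($xml)
(: yields an equivalent JSON string again :)
```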
Cheers, Dirk
On 10/15/2015 11:04 PM, Hans-Juergen Rennau wrote:
Hi Marco,
I am not quite sure if I understand your question well, but let me make a couple of remarks about the rationale of treating JSON as XML.
XML is accessible to XPath, nested maps and arrays are not. Hence any non-trivial navigation of XML trees is incomparably simpler and more elegant than navigation of JSON trees represented by nested maps and arrays. Similarly, the *construction* of complex XML structures is often much simpler than the construction of nested maps and arrays. As a consequence, the processing of JSON can often be radically simplified by dealing with XML, rather than JSON: just frame your processing between an initial JSON to XML transformation (json:parse) and a final XML to JSON transformation (json:serialize).
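A minimal sketch of this framing with the BaseX JSON Module (assuming its default "direct" format, where object member names become element names):

```xquery
let $input := '{"name": "Marco", "ids": [1, 2, 3]}'
let $xml   := json:parse($input)    (: JSON -> XML (BaseX direct format) :)
(: in between, navigate the intermediate tree with plain XPath,
   e.g. the first array member: :)
let $first := $xml/json/ids/_[1]
return json:serialize($xml)         (: XML -> JSON again :)
```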
It is important to realize that there are alternative XML representations of JSON. The representation defined in the XSLT 3.0 spec is very "clean" from a conceptual point of view, but navigation-unfriendly, as the all-important object member names are represented by attributes rather than element names (which are restricted to "map", "array", "string", "number", "boolean", umph!), resulting in a rather cluttered representation requiring clumsy XPath expressions. The alternative representation offered by BaseX [1], on the other hand, looks intuitive and enables navigation which is in typical cases as elegant as if the data were supplied in XML to begin with. I can highly recommend dealing with JSON via the BaseX XML representation, and I use this XML format also in the construction of JSON.
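To illustrate the difference, here is the same lookup in both representations (a sketch: the W3C representation uses fixed element names plus @key attributes, while the BaseX direct format turns the keys themselves into element names):

```xquery
let $json := '{"order": {"id": 42}}'

(: W3C (XSLT 3.0 / XPath 3.1) representation: generic names, keys in attributes :)
let $w3c := fn:json-to-xml($json)
let $id1 := $w3c/*:map/*:map[@key = 'order']/*:number[@key = 'id']

(: BaseX direct format: keys become element names :)
let $bx  := json:parse($json)
let $id2 := $bx/json/order/id

return ($id1/string(), $id2/string())
```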
An advantage of the map/array representation is higher performance, which may become important especially when dealing with very large documents.
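For completeness, the map/array route in XQuery 3.1 looks like this; no XML tree is built at all, which is where the performance advantage comes from:

```xquery
let $data := fn:parse-json('{"items": [10, 20, 30]}')
(: $data is a map; lookups use the ? operator, with no node construction :)
return sum($data?items?*)   (: 60 :)
```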
Wrapping up, it is very important to have access to both principal kinds of JSON representation in XQuery, maps/array and XML.
Kind regards, Hans-Jürgen
[1] http://docs.basex.org/wiki/JSON_Module#BaseX_Format
PS: If you are interested in a discussion comparing navigation of JSON represented by maps/arrays versus XML, you might consider reading here:
http://www.balisage.net/Proceedings/vol15/html/Rennau01/BalisageVol15-Rennau...
Marco Lettere marco.lettere@dedalus.eu schrieb am 21:54 Donnerstag, 15.Oktober 2015:
Hi all, I have a bit of difficulty understanding the rationale behind the XML to JSON (and vice-versa) conversions. I must admit I'm a bit confused about when to use which format and how they map to serialization/parsing rules. In particular, I'm wondering whether it's really necessary to use both formalisms (and functions) in the following example. My feeling is that using only the standard (?) XQuery 3.1 representation would be much clearer and would enable a cleaner service chaining mechanism. Does anyone have 2 minutes left to explain the model behind this sort of thing to me? I really appreciate your support! Regards, Marco.
module namespace jt = "urn:json:test";

declare
  %rest:path("jsonsrv")
  %output:method("text")
  %output:media-type("application/json")
function jt:service() {
  xml-to-json(
    <map xmlns="http://www.w3.org/2005/xpath-functions">
      <string key="a">12345678</string>
    </map>)
};

declare
  %rest:path("testreq")
  %output:method("text")
  %output:media-type("application/json")
function jt:test() {
  let $response := http:send-request(
    <http:request href="http://localhost:9984/jsonsrv" method="POST">
      <http:body media-type="application/x-www-form-urlencoded" method="text"/>
    </http:request>)
  return json:serialize($response[2])
};
--
Dirk Kirsten, BaseX GmbH, http://basexgmbh.de
|-- Firmensitz: Blarerstrasse 56, 78462 Konstanz
|-- Registergericht Freiburg, HRB: 708285, Geschäftsführer:
|   Dr. Christian Grün, Dr. Alexander Holupirek, Michael Seiferle
`-- Phone: 0049 7531 28 28 676, Fax: 0049 7531 20 05 22