My advice on quitting smoking — get someone to lock you in a room where you can’t get a cigarette for a solid week. That’s the hardest part of quitting, getting over the physical addiction.

109 responses to this post.

  1. Posted by PXLated on December 20, 2006 at 9:15 am

    NO…
    I support the “troops” even though I don’t support the war as it is.
    As a Vietnam vet, I remember the protests and still despise some of those involved. There was a time in that war when it wasn’t considered “bad”. It became that while I was there, a complete change from when I went to when I returned. There are a lot of troops who joined before this war and are just stuck. Don’t demean them with protests.
    I think there’s a better way: gang up on the politicians en masse with calls, email, snail mail. Make it personal, one-on-one with the clowns that went along with this. Focus the heat where it belongs.

    Reply

  2. How come more sons and daughters from NorCal and Silicon Valley aren’t serving?
    Just wondering,

    Jim Forbes

    P.S. As a Southern Californian, I want to know: how come you hid my water under your dirt?

    Reply

  3. The idea behind JSON is that it can be parsed with one line of JavaScript. The client must have full trust in the server, because the JSON is actually executable code that will be evaluated by the JavaScript engine. It’s kind of nonsensical to use it for non-JavaScript clients.
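
    As a sketch of that one-liner (the payload here is made up for illustration):

```javascript
// Hypothetical JSON payload, e.g. the body of a web service response.
var jsonText = '{"tags": ["xml", "json"], "count": 2}';

// The "one line" of parsing: the parentheses force the JavaScript
// engine to read the braces as an object literal, not a block.
var data = eval('(' + jsonText + ')');

console.log(data.tags[0]); // "xml"
console.log(data.count);   // 2
```

    And that is exactly the trust problem: whatever is in that string executes.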

    Reply

  4. No, it’s not XML — that’s why it is catching on. In the AJAX biosphere all parsing is handled by JavaScript (okay, ECMAScript, but who really calls it that?) and parsing through a ton of XML cruft in order to get to data, when all you need is a single integer or string or array, wastes time and adds needless complexity. If you know what you’re getting there’s no need for formatting or schema, just send the data.

    Reply

  5. Dan, where are the benchmarks that say that a processor capable of running Flash apps or playing YouTube videos can’t parse XML “cruft”? If you’re going to use an engineering argument, be prepared to back up your assumptions with data. I’m pretty sure you can’t, because I ran my own benchmarks on the end-user CPU of the late 90s and the overhead of parsing XML was negligible then, and now..? Geez Louise. How many gigahertz do people have these days??

    Reply

  6. I haven’t looked at JSON properly yet myself, but some info here suggests it’s ‘The Fat Free Alternative To XML’ – http://www.json.org/xml.html

    hmm.. the jury’s out on that for me still.

    Reply

  7. OK but why would delicious want to limit their API to JavaScript? Just to save on some pretty minor XML overhead? Are there Perl, PHP, Python and/or Ruby modules to parse JSON? Wouldn’t this API be useful on other platforms?

    I could see the reason to use JSON if the utility was only of use on the client side but this could be useful to someone using any of several interpreted languages.

    Maybe delicious already has a similar API for those people?

    Reply

  8. I can certainly see that the result set with JSON looks like a format that JavaScript and ActionScript could handle natively, as arrays, as opposed to loading up various XML parsing libraries.

    PHP5 does have great XML functions these days (though they could be better still) – as well as JSON – so I suppose the question is: which is faster/smaller, the JSON or XML libraries?

    There’s clearly less chardata in the JSON results, and also multi-dimensional arrays are cerebrally very much like XML structs 😉

    Hmmm..

    I heart xml though – it IS human readable, if you ask me (until you ‘prettify/obfuscate’ it) ! ;p

    Reply

  9. One of the great things about the PHP site is all the user contributions: Here, someone has provided a nice script to help make JSON data more ‘human readable’

    http://uk2.php.net/manual/en/function.json-encode.php#71378

    Oh yes – I heart PHP too!

    Reply

  10. Dave – there are JSON libraries for most languages already built, and it looks like the typical one is much smaller than an equivalent full-blown XML parser.

    The big downside of JSON as a strategy is that you’re sending over executable code, not just data, and so if some security flaw happens in the system at the server side and the client is lazy or stupid there can be repercussions. Would you exec code that came from a random server?

    Reply

  11. Ryan’s point is the best one: why limit based on the assumption of a JavaScript client?

    JSON is easier to process than XML within a JavaScript client application. I wouldn’t say it’s orders of magnitude easier, but easier.

    However, the purpose behind the delicious decision is the ability to break out of the javascript sandbox, and make a service request to a domain that’s not the same domain that issued the page.

    This is a big thing for Yahoo!, and you can see the writeup on this from the Yahoo! UI site:

    “Callbacks are particularly useful for use with web service requests in client-side JavaScript. Normally web service requests using the XMLHttpRequest object run afoul of browser security restrictions that prevent files from being loaded across domains. This restriction requires you to proxy your requests on the server side or use server rewrites to trick the browser into thinking the web service data is coming from the same site as your web content.

    Using JSON and callbacks, you can place the Yahoo! Web Service request inside a <script> tag, and operate on the results with a function elsewhere in the JavaScript code on the page. Using this mechanism, the JSON output from the Yahoo! Web Services request is loaded when the enclosing web page is loaded. No proxy or server trickery is required.”
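
    A minimal sketch of that callback mechanism (the function name, URL, and payload are invented; in a real page the response would arrive via a script tag rather than the eval used here to simulate it):

```javascript
// The page defines a callback before loading the cross-domain script:
//   <script src="http://example.com/service?callback=handleResult"></script>
// (hypothetical URL and parameter name).
var received = null;

function handleResult(data) {
  // The service's JSON arrives as a ready-to-use JavaScript object.
  received = data;
  console.log("got " + data.items.length + " items");
}

// The server replies with a function call wrapping the JSON; the
// browser executes it as ordinary script. eval() simulates that here.
var response = 'handleResult({"items": ["a", "b", "c"]})';
eval(response);
```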

    delicious has already said that this functionality will, most likely, have to be wrapped in a specific widget format, or, as Niall wrote, …may be shut down if not used in full Yahoo-constructed blog sidebar badge form.

    This prevents something like Tailrank from integrating the service into its service, and makes it really viable for individual users.

    And that, Dave, in a nutshell, is the reason why delicious is providing a JSON format.

    Reply

  12. Chris, there is a regular expression check for well-formedness that you should run before you pass the JSON into eval(). That takes care of most of the security implications, at least the most heinous ones.
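
    For reference, this is roughly the check in question (the regular expressions are the ones given in RFC 4627; wrapping them in a helper function is my own framing):

```javascript
// Reject any text that, once string literals are removed, still
// contains a character that can't appear in well-formed JSON.
function parseJson(text) {
  var stripped = text.replace(/"(\\.|[^"\\])*"/g, '');
  if (/[^,:{}\[\]0-9.\-+Eaeflnr-u \n\r\t]/.test(stripped)) {
    throw new Error("not well-formed JSON");
  }
  return eval('(' + text + ')');
}

console.log(parseJson('{"a": [1, 2]}').a.length); // 2
try {
  parseJson('{"a": alert(1)}'); // parentheses fail the check
} catch (e) {
  console.log("rejected before eval: " + e.message);
}
```

    The parentheses of any function call fail the character test, so active code is rejected before eval() ever sees it.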

    Dave, I don’t know of any studies of the speed difference, but if you consider that the parsing of Javascript (and therefore JSON) is going to happen in the browser, and will be highly-optimized and probably in native code, whereas the parsing of XML would have to happen in Javascript, and therefore in a VM in the browser, surely you can see that JSON will be quite a bit faster. All the more so since it doesn’t need to keep track of element names, etc.

    As well, you’ll notice that YouTube uses Flash to play its videos, and the Flash player isn’t implemented in JavaScript, but instead as a C/C++/whatever plugin that runs native code. Your comparison is really apples to oranges for that reason.

    Finally, JSON is typed, whereas in XML, everything is a string. Being typed is an advantage for solving certain classes of problems.

    Ryan, there are modules to parse JSON for Perl, PHP, Python, Ruby, C, C++, C#, Delphi, Erlang, Haskell, Java, Lisp, Lua, Objective C, Rebol, and Squeak. And a few other obscure languages. 😉 (And even if there weren’t, the spec is simple enough that I’m pretty sure I could write one for a new language in under a week, including the time to learn enough of the language to write it. Heck, maybe I’ll write one for OCaml.)

    Reply

  13. The nice thing about JSON is that you *don’t* have to parse it on the client side. Just save the returned data as a JavaScript variable and your data object is ready to go.

    Reply

  14. Sorry, it doesn’t ‘prevent’ the other services from grabbing the data (XML/JSON, server applications can work with both)–but it wasn’t meant to be used by these types of services, and the output was formatted specifically for JavaScript/widget use.

    XML cannot provide this functionality. JSON allows us to break out of the browser sandbox.

    Whether this is a good idea, is another good topic.

    Reply

  15. Blake, both SOAP and XML-RPC serialization handle typing.

    Reply

  16. Fair enough. I forgot you weren’t talking about plain old XML for a second there…

    My other points still stand, I believe.

    Reply

  17. Posted by Mike Parsons on December 20, 2006 at 12:01 pm

    Hey Dave,

    I think you are missing the point with JSON, XML and other data formats. The point is that from an end-user’s (developer, etc.) perspective, there is no such thing as one universal data format. It’s nice to be able to get the data in EXACTLY the format you want. JSON is just another choice. Assuming that XML is the only format that makes sense is naive, especially when you look at all the existing applications out there that expect data in a multitude of data formats.

    Give the consumer the “choice” of how they want data, instead of imposing a specific data format on them. After all, you wouldn’t want anyone to take away your ability to choose, would you?

    Reply

  18. It’s more lightweight, as kosso pointed out, and it’s natively readable in JavaScript (which is important, because JavaScript is the only language that gets a chance to parse the data in an Ajax application). It’s true that the cost of parsing XML is negligible, but you still have to do coding work to get it parsed, and you incur those same costs to encode it if you ever send anything back to the server. All you have to do is eval(jsonstring) and you have a working JavaScript object/datastructure/whatever other JS closure you feel like creating. We’ve found it to be easier than messing with a DOM and more flexible than parsing a string for browser/server communication.

    That said, I think it has no extra value as a server-to-server kind of communication (unless message size is *really* important to you), and I don’t think it was meant to. It’s just an easy, flexible way to communicate with a JS-enabled app.

    Reply

  19. Wow, a lot got said as I was typing that :-/ sorry if I duplicated any thoughts.

    Reply

  20. I wish del.icio.us had used XML as well as JSON, but it’s not too big a deal for me. I’ve been decoding JSON today (see my blog) with some success. If you are using PHP 5, then json_encode and json_decode are native and extremely quick. The JSON (json.org) website lists a lot of libraries for lots of languages.

    Using json_decode, PHP churns out an associative array. From that, it only took me seven lines of code to turn the data I wanted into XML (RDF in my case). JSON sucks, because it’s another layer of complexity. But it’s not something I get too stressed over.
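
    The round trip is short in JavaScript too; a rough sketch (the field and element names here are invented, and real code would need to escape the attribute values):

```javascript
// Parse a (made-up) JSON list of bookmarks...
var posts = eval('([{"u": "http://example.com/", "d": "Example"}])');

// ...and re-serialize it as simple XML.
var xml = '<posts>\n';
for (var i = 0; i < posts.length; i++) {
  xml += '  <post href="' + posts[i].u +
         '" description="' + posts[i].d + '"/>\n';
}
xml += '</posts>';

console.log(xml);
```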

    I’d agree with you, Dave, about the lynchings, but they don’t tend to work too well. A little town in Massachusetts called Salem proved that.

    Reply

  21. Dave seems to me to be a bit off on his understanding of the purposes of SOAP and JSON. SOAP is an amorphous pseudo standard that keeps changing without being properly updated anywhere. SOAP is used how Microsoft says it should be used…this quarter. JSON really has nothing to do with RPC, it’s much more Resource-oriented. JSON actually delivers on the acronym of SOAP; JSON is Simple Object Access. It’s accomplished as a serialization format, not an RPC kludge. JSON is _only_ the serialized object, not the transport layer.

    XML is a great solution for data interchange, to be sure, and I don’t think that public APIs that interchange documents should use anything but real XML. That being said, JSON still has quite a lot of usefulness in web programming. JSON objects are easily created (though rarely RFC 4627 compliant), and if one wanted, an XSL transform could take a POX document and make it into a JSON object.

    But the real issue for why people use JSON over XML is that XML tools still sorta stink for doing something small. Manipulating the DOM is memory intensive and using SAX and XSLT scares most people who don’t like recursion and the like. JSON allows for really simple and powerful object creation with a minimal learning curve. To use JSON one doesn’t have to know about XML, namespaces, event handlers, or anything other than simple objects.

    Reply

  22. LOL @ Tom – I was just doing the same thing! 😉

    Hey this discussion has made me learn a few things about JSON – thanks !

    Especially the cross-domain/browser sandbox issue – one I have come across many times with Flash and ActionScript.

    Another thing about JSON I read is that all data is considered to be UTF-8, which is good news for international data on your systems. I have recently had to deal with a slew of poorly formed, poorly encoded data which renders differently in different browsers. I hope this might help.

    I agree though that a system like del.icio.us should support both. It’s like supporting different versions of RSS or OPML – do both – it’s easy!

    Reply

  23. Meh. JSON’s an option – if you’re operating in Javascript then it makes sense to cut the processing time out of the process. I’m not aware of any decent service that’s *only* offering JSON – that would be silly.

    [Not that anyone visiting blogs chock-loaded with these ‘widgets’ I keep hearing about would oppose some efficiency gains in the use of Javascript. This morning I watched a slew of sites take half a minute each to render while some POS from Twitter held everything up. Other days it’s Technorati, or some ads, or whatever. Mumble, grumble.]

    Reply

  24. Kosso, that’s not exactly correct…

    http://www.ietf.org/rfc/rfc4627.txt says that all characters in strings (including the keys for objects/hashes) will be Unicode, and then it gets kind of clever…

    From the spec:
    JSON text SHALL be encoded in Unicode. The default encoding is
    UTF-8.

    Since the first two characters of a JSON text will always be ASCII
    characters [RFC0020], it is possible to determine whether an octet
    stream is UTF-8, UTF-16 (BE or LE), or UTF-32 (BE or LE) by looking
    at the pattern of nulls in the first four octets.

    00 00 00 xx UTF-32BE
    00 xx 00 xx UTF-16BE
    xx 00 00 00 UTF-32LE
    xx 00 xx 00 UTF-16LE
    xx xx xx xx UTF-8
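
    That sniff is easy to express in code; a sketch (the function name is mine):

```javascript
// Detect the encoding of a JSON octet stream from the null-byte
// pattern in its first four octets, per the table above.
function sniffEncoding(bytes) {
  var z0 = bytes[0] === 0, z1 = bytes[1] === 0,
      z2 = bytes[2] === 0, z3 = bytes[3] === 0;
  if (z0 && z1 && z2) return "UTF-32BE";
  if (z0 && z2)       return "UTF-16BE";
  if (z1 && z2 && z3) return "UTF-32LE";
  if (z1 && z3)       return "UTF-16LE";
  return "UTF-8";
}

// '[' is 0x5B and '1' is 0x31; the first four octets of a small
// JSON text beginning "[1" in each encoding:
console.log(sniffEncoding([0x00, 0x00, 0x00, 0x5B])); // "UTF-32BE"
console.log(sniffEncoding([0x00, 0x5B, 0x00, 0x31])); // "UTF-16BE"
console.log(sniffEncoding([0x5B, 0x00, 0x00, 0x00])); // "UTF-32LE"
console.log(sniffEncoding([0x5B, 0x00, 0x31, 0x00])); // "UTF-16LE"
console.log(sniffEncoding([0x5B, 0x31, 0x5D, 0x20])); // "UTF-8"
```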

    Reply

  25. Dave, JSON has its uses and pitfalls just as XML has its uses and pitfalls. What I don’t like is when fanatics get blinded and start comparing apples and oranges. Heck, they are just fruits. Eat what you feel like.

    Re Delicious, I think they should add an XML API if the demand is there.

    Reply

  26. I used your original XML-RPC; it was beautiful. It worked, and was easy to use. I never got into SOAP because it looked awful and seemed overly complicated compared to XML-RPC and was driven by Microsoft…and none of the web apps supported it, like WebLogic when I used them.

    As for JSON…it’s worse. I’ll never use it unless I’m forced to…but I can write my own protocol in 5 minutes.

    I’m not impressed.

    Reply

  27. Shelley,

    Interesting. Do I understand correctly that (asking sincerely):

    JSON code/data can be downloaded from a foreign server, via a SCRIPT tag’s SRC attribute, as compared with XML over XMLHttpRequest, which famously cannot?

    I suppose this must be true. The Google Maps API works this way, essentially.

    Shelley’s point is important. The ‘efficiency’ argument is a red herring.

    However it does raise an important question that Shelley hinted at in a later comment: If it is so dangerous to download an XML string from a foreign server that the browser forbids it by policy, is it safe to download executable JavaScript from a foreign server?

    If it is safe to download and execute JavaScript from a foreign server, maybe the default safeguards in XMLHttpRequest should be loosened? And if it is not safe, should SCRIPT SRC be forbidden?

    Reply

  28. PS the other interesting thing Shelley said, of course, is that delicious does not WANT this to be used anywhere but in the browser, and in a specific sort of Yahoo widget. If this is the case, why is anyone calling it an API? Where’s the ‘P’?

    Reply

  29. so JSON is only for JavaScript programmers. I just read the responses above…so, why does JSON just return the serialized stuff for Javascript…why can’t it just return Javascript itself, and run?

    Reply

  30. First of all, you have to keep in mind that JSON was not invented, it was discovered. Nobody sat down and said “Let’s re-invent XML-RPC in JavaScript!” It was simply that developers working with JavaScript realized that the language itself had a nice shorthand syntax for encoding generic data structures, and ran with it.

    Even if you don’t buy the performance argument, it’s just much more natural and less verbose to work with a hierarchy of JavaScript objects and arrays than to crawl the XML DOM. Even if the time to call parseInt() or object.getAttribute(“foo”) is negligible, I would much rather have my integers as integers from the beginning, or write “object.foo”.
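
    A small illustration of the difference (the JSON payload and the DOM calls in the comments are hypothetical):

```javascript
// With JSON the value is already typed and directly addressable:
var object = eval('({"foo": 42, "bars": ["a", "b"]})');

var n = object.foo;         // already a number -- no parseInt() needed
var first = object.bars[0];

// The XML-DOM equivalent would be something like (browser-only sketch):
//   var n = parseInt(node.getAttribute("foo"), 10);
//   var first = node.getElementsByTagName("bar")[0].firstChild.nodeValue;

console.log(typeof n); // "number"
console.log(first);    // "a"
```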

    So, at the end of the day, it comes down to a question of usability. JSON is just much more usable than XML if you’re writing JavaScript.

    Reply

  31. I ran some benchmarks on “JSON” before it had a name.

    Using Mozilla as my browser, I was returning an XML structure that had maybe 10KB of data in it. Trying to process it froze the browser solid for four tenths of a second while the XML was parsing and being broken out into the DOM structure Mozilla uses, after which I had a DOM structure, not the structure I was actually looking for in my Javascript app.

    Returning a Javascript object and running “eval” on it returned the structure in .004 seconds, one hundred times faster.

    The ratio has changed a bit in XML’s favor, but the browser XML parsers are fundamentally trying to do a lot more with XML than with JavaScript objects, and for better or worse, the performance hit gets into “user-noticeable” pretty quickly.

    If you need to schlep a lot of data quickly onto the client side of a browser, you really need “JSON”. (I actually prefer breaking out the full name, “javascript object notation”, because it’s not really an acronym-able technology, just matching the grammar of a language so you can tap a fast parser.)

    Using JSON for anything else is pretty silly, but since it’s there, it’ll get used. If there had been better XML support in the browsers (notably a SAX-style event parser instead of the heaviest-possible DOM parser), it might not have happened.

    This is the sort of thing that really reminds you that browsers are not very powerful development environments.

    Reply

  32. I just want to also add that I think it’s wrong to consider JSON a replacement for XMLRPC. If I were publishing a web service API, I would offer both. JSON is simply a pragmatic concession to the growing JavaScript/Ajax community, but it doesn’t make much sense to me to write a JSON parser for other languages.

    It would also be dangerous to do so, since JavaScript is one of the few languages that most commonly run sandboxed, so it is safe to directly eval() an incoming data packet. I wouldn’t want to eval() a Python or Perl data structure on my web server like that.

    Reply

  33. Thanks Blake. It was specific to a particular PHP JSON function, I think.
    I’m learning about this too, having never used (avoided) it before.

    🙂

    Reply

  34. Counter to what almost everyone here has said, JSON isn’t just for JavaScript. I frequently use it to move data from PHP applications to Python applications – and it’s by far the best tool for that job. If you have structured data in one place (by structured I mean strings, associative arrays and lists) and want to move it somewhere else you need some kind of cross-platform serialization format. XML isn’t one – you have to invent your own format using XML first. XML-RPC and SOAP almost solve this problem but come with extra baggage – they are full-blown RPC mechanisms with transports, when I just need a serialization format. JSON hits a really nice sweet spot – it’s simple, human readable, efficient and has libraries in all of the languages I care about.

    XML is a great tool for some jobs, but in my experience sending a simple list of strings from one place to another is a lot less painful with JSON.

    Reply

  35. Dave,
    I respect that you helped create syndication and XML-RPC. I agree that the SOAP stack has gotten crazy, but you are completely wrong about JSON.
    1) JSON is thinner than XML (slightly, no big deal really)
    2) JSON is native to the client. Having to load a specialized library to transform my data SUX.
    3) JSON can do cross-domain calls that XmlHttp requests cannot
    4) It is easier to encode/escape in JSON than XML
    5) JSON can define a function and data in one pass
    6) JSON can be used as a PUSH from the server (ok – it really is a managed pull, but at least it is not POLLING)

    Methinks you are trying to revitalize XML-RPC, which was killed off with the hype of SOAP. XML-RPC is kewl, but we have learned since then: we need a schema, a descriptor of services, binding, transactions, etc. Some of the recent work with JSON gives us these. Are they bound just to this ‘new’ technology? Of course not – but let us not launch a holy war just to further your EGO.

    Reply

  36. Dave, I would have expected you to be one of the first people to understand and appreciate the strengths of JSON, because they are so practical and simple.

    One huge practical advantage to JSON over XML in the web browser is that you can load JavaScript from any site, while you’re restricted in which sites you can load XML from. I have no idea why browsers insist on limiting where you can load non-executable XML from, when they don’t bother to limit where you can load executable JavaScript from, but there you go: that’s how it is, and there’s nothing anybody can do about it. JSON nicely works around that limitation, instead of holding its breath and waiting until the world comes around to being fair. Surely you can appreciate that practical advantage.

    Another practical advantage is that JSON is the absolute fastest way to convert text data into a usable native format. Even if the XML parser could be as fast as the JavaScript parser, it parses the text into many fluffy XML DOM nodes (possibly including text nodes for intermediate white space you have to skip over), which is yet another layer your code has to go through to get any useful work done. Instead, JSON parsing DIRECTLY results in tight JavaScript data, which is the most optimal format for processing with JavaScript, with no wasteful layers of indirection and generic DOM APIs that require lots of effort for extraction and type conversion. XPath expressions and DOM access are NEVER going to be as fast as direct array access in JavaScript. JSON simply has fewer layers than XML, so it’s way faster and uses much less memory, and it’s easier to use because you have direct access built into the language, instead of accessing data through a DOM API.

    The other advantage is that it’s a more-or-less universal lowest common denominator between languages, high enough to be useful, but not too low. XML has many features that most languages don’t support and many applications don’t need, because they’re really just working with arrays and dictionaries, and don’t need all the features of XML.

    JSON makes it possible to use a bunch of different languages together easily. And there’s no way to get around the fact that JavaScript totally rules in the web browser, but has a lot of quirky weaknesses that must be worked around. And JSON serves both purposes quite practically (interoperating with other languages, while taking advantage of JavaScript peculiarities and working around its limitations). And it’s that practical approach that I would have expected you to appreciate.

    -Don

    Reply

  37. Ryan, in this case, yes I don’t think efficiency was the reason that delicious made this decision. From what I saw of the video and what Niall said, it sounds like they want this service to be part of widgets, not necessarily consumable by other services. Hard to say until they release the documentation.

    I actually would prefer things stay the way they are in browserland. We use XHR for our web application, serving up our own data from our own servers, in an environment we, more or less, can control.

    However, it’s a pain to do a proxy, and it’s prohibitive for widgets. We can’t ask people to install PHP on their servers just to run a widget. And it’s not feasible for the Typepad and WordPress.com and blogger webloggers.

    I was able to get a running example against delicious using Niall’s URL in less than 2 minutes. (Thank you Firebug, I love you.) That’s creating the page, writing the script, copying and pasting the url, … I could do so because I can easily ‘read’ the structure of the data. More than that, it’s so easy! Perfect for widgets.

    Now, is this safe? That’s where it gets more interesting, especially because the people most likely to use widgets are the ones least likely to be able to understand how they work. Can we trust delicious? I’m assuming so. Can we trust that some hack can’t occur that will cause delicious to ‘publish’ something that will end up being a distributed launchpad for the next worst virus? That’s what we’ll have to see.

    But I must admit, it is very simple to code widgets and that’s only thanks to JSON.

    Reply

  38. If someone wants to offer server-side data over JSON it is pretty trivial to offer the same data over XML-RPC. According to Shelley and her quote of Niall, there is a specific reason for not offering it over XML-RPC, which is to lock out people who want to use the data outside of a narrowly prescribed use (not judging, just describing).

    So arguing the efficiency angle is beside the point. No?

    Reply

  39. I’m a .NET (WinForms mostly) guy, so this is the first time I’ve seen JSON.

    I have written AJAX-like apps in a past life, and at the time I remember thinking “why the hell are we passing heavy XML around when we could just as easily pass around any sort of delineated string?!”. It looks like JSON is the best of both worlds: lightweight and well-formed.

    The gigahertz and bandwidth arguments are pretty lame, especially coming from an old-school guy like you, Dave. Surely you must agree that efficient processing and bandwidth use is a Good Thing, even when you have plenty of both to burn? Hell, given the cellular data rates in New Zealand I’d pick JSON over XML every time if I was writing a mobile data consumer.

    Reply

  40. PS

    s/offer the same data/ALSO offer the same data/

    Reply

  41. Per Jeremy, JSON _is_ an order of magnitude faster than XML. Two orders it sounds like. I stand corrected on this one.

    Reply

  42. Makes sense, though, doesn’t it Ryan? That this service makes more sense from a widget perspective than a web service one?

    And you’re right, in this instance: the efficiency is not the issue so much as the format and usage.

    It is not replacing XML-RPC or SOAP. It is ‘other’.

    Reply

  43. Just to be fair, there are some wrinkles in JSON’s universality.

    Here is a possible weakness with PHP processing JSON: PHP arrays are the same as PHP dictionaries, so how does PHP know how to convert an empty array/dict to JSON, so that it can survive a round trip without changing type?

    If I read in the json [] to PHP, then write it back out, how does PHP know to write out {} or [], since they are the same to PHP, which has no separate dict and array types? The same problem applies to dictionaries whose keys just happen to be consecutive integers, but the zero length array and dictionary are a more common example.
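
    In JavaScript itself the two stay distinct, which is exactly what a language with a single array/dictionary type can lose on the round trip; a small demonstration:

```javascript
// JavaScript keeps empty arrays and empty objects as different types:
var emptyArray  = eval('([])');
var emptyObject = eval('({})');

console.log(emptyArray  instanceof Array); // true
console.log(emptyObject instanceof Array); // false

// An object whose keys happen to be consecutive integers is still an
// object, not an array -- the distinction PHP can't preserve:
var dict = eval('({"0": "a", "1": "b"})');
console.log(dict instanceof Array); // false
```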

    Another wrinkle in JSON’s universality is that the Flash player does not have a built-in JavaScript parser, so in Flash’s version of JavaScript, JSON does not have the advantage that it’s trivially easy and efficient to parse. (People have written JSON parsers in Flash of course, but they’re nowhere near as fast as the built-in XML parser.)

    So in Flash, XML is still the preferred format, but it still uses a lot of memory: the Flash XML parser parses XML into JavaScript data structures that are more complex than JSON (twice the number of layers, three times the number of structures: an XML node gets parsed into an object with an attributes sub-dictionary and a contents sub-array).

    The absolute most efficient way to load data into Flash is by loading and executing compiled binary SWF files, which can contain executable byte codes and binary literals that directly create the data structures you want to send.

    If you want to minimize the amount of data sent to the Flash client, the server can encode XML (or JSON) as a SWF file, and send it as compressed binary data.

    The OpenLaszlo server can compile XML and JSON into SWF, because years ago Flash’s XML parser was much slower, so it was worth parsing the XML on the server side (plus the added advantage that the server could proxy XML from any domain). But now Flash’s XML parser is much better, so it’s more efficient to download XML text to parse in Flash.

    However Flash still has its own restrictions on which domains you can download SWF or XML from, so Flash can’t take advantage of browser-based JavaScript’s ability to load JSON from any domain. So because of Flash’s restrictions (and browser bugs like IE6’s refusal to deliver compressed http content to plug-ins), you sometimes end up having to write a server-side proxy anyway, instead of downloading XML directly into the Flash player.

    -Don

    Reply

  44. Shelley, thanks for elaborating. I just saw your comment after writing my last two posts; we were posting at the same time. I can definitely see the practical utility of JSON.

    However, Dave’s question makes sense too and could be rewritten as “Why not ALSO offer an XML-RPC interface?” for people on the server side who don’t want to mess with JSON (even though there are libraries, good as it is to know there are) as Yet Another Serialization format.

    It sounds like the answer is maybe “It’s a narrow API” but it’s unclear at the moment.

    Reply

  45. Hi,

    The reason we used JSON for this is because it is necessary for inclusion within an HTML badge scriptlet. Characterizing this as an API is an overstatement and/or miscommunication.

    Reply

  46. I’ve had all my comments deleted.

    Reply

  47. Oops, no there they are. Odd.

    Reply

  48. Yahoo! does with theirs, Ryan. You pass in output=JSON to get JSON, but you can also choose other formats.

    The person behind Tailrank wrote a comment at Niall’s which gives the impression that delicious wants to put tight controls over its data services. Hard to say, though.

    Reply

  49. I liked the part where Dr Phil said “What were they thinking?” I asked the same question when I first saw XML being proposed as a data format. There were obviously better alternatives.

    The good thing about reinventing the wheel is that you can get a round one.

    Reply

  50. Don: Take a look at the json_decode() function over at php.net – when you run it you have to specify whether you want an object or an associative array (you specify the latter by sticking a boolean true as the second parameter on json_decode). json_encode() takes either an object or an array and plonks out the relevant JSON for you. You can do a var_dump() and/or print_r() to verify this on PHP 5.2.0.

    Reply

  51. Posted by pwb on December 20, 2006 at 2:29 pm

    “I have no idea why browsers insist on limiting where you can load non-executable XML from”

    The security issue with cross-domain XMLHttpRequest is that I could serve a page to you which would, for example, be able to grab pages from your intranet (behind your firewall) and return them back to me. I’m not sure if the same is possible with JSON.

    Reply

  52. Ryan, I also wanted to say that we’re talking apples and oranges, too.

    We’re talking JSON as compared to XML-RPC and SOAP. We should be discussing the relative merits of XML and JSON, or of REST, XML-RPC, and SOAP. The former are formats, the latter are service protocols.

    Also, another point: we can easily convert between one well-understood and well-defined format and another. The RDF folks have moved away from RDF/XML to Turtle, which is very JSON-like in that it implements a format that supports a specific model, rather than serializing the data in a ‘foreign’ structure.

    It’s all interchangeable. However, it’s not all interchangeable if you’re using XML-RPC and SOAP.

    Reply

  53. Ah, thanks. I’m not familiar with JSON and it sounds like (?) one can send code along with data, but at heart I guess it’s a format not a protocol? In fact using SCRIPT SRC one must send code not data I would think, but point taken.

    Reply

  54. PS I’m not sure I follow your last sentence though.

    Reply

  55. @Don : Wow! Thanks for making this comment thread by far one of the most useful and educational I have read in a long time. Your ability to explain it clearly is phenomenal. 🙂

    I’m in the middle of throwing a lot of little bits of data around web pages, and between Flash in pages via JavaScript and Flash shared objects, as well as Ajax and PHP-parsed XML.

    I am certainly going to be looking at how JSON can help the system I’m building – though I will say the power of OPML inclusion has reeeeally helped a lot thus far.

    In my Flash files I will still use XML, as I also want to make that available to all the aggregators and clients out there that can read it fairly well. But what I will do is get our forthcoming API to output JSON too. Now that my JavaScript is brushing up, I will delve deeper into widgets. 😉

    For older Flash versions 4/5 – those also still only available on mobile platforms – I have always had to string results together and cook up some multi delimiters to create ‘pseudo’ arrays which ‘old’ Flash ActionScript could deal with.

    Now I realise that with different delimiters and a little tweaking, it almost WAS JSON! 😉

    I can safely say now – I get it! Thanks JSON!

    Reply

  56. Look at the bottom of json.org for links to parsing and generating libs in many prog langs.

    It’s not just for JS– it’s a data serialization format that happens to also be valid JS object literal notation.

    Yeah, everyone’s pulling their puds, and stupider than Dave.

    Yep.

    Reply

  57. Here’s what Joshua Schachter (founder) says about this so-called JSON API for del.icio.us:

    “This is just a simple JSON endpoint for a badge scriptlet we are going to release. Calling it an API is an overstatement and/or miscommunication. We may change formats, endpoints, etc, all the usual disclaimers apply.”

    The comment was over at Niall’s site. http://www.niallkennedy.com/blog/archives/2006/12/delicious-url-api.html

    So there is no reason for this argument. JSON does fine for widgets, and that’s exactly what del.icio.us is using it for.

    Reply

  58. Dave,
    Wow, way to totally ignore history. JSON is neither new, nor was it intended as an RPC format. It was actually baked into the ECMA-262 standard back in 1999. Its original intent was as a shorthand way to declare arrays and objects in a JavaScript program. It just turns out that combining that notation with eval() gives you a dead-easy RPC format for JavaScript applications.

    Reference: http://www.ecma-international.org/publications/standards/Ecma-262.htm

    Look at the sections on “Array Initializers” and “Object Initializers”. That’s basically JSON right there, just by a different name.

    I’ve got my own opinions about some of the other arguments here, but I’m not patient enough to read through everything here to make sure I don’t repeat something.

    Reply

  59. You’re right on the money, Dave. Xml is the data format of choice – why would you build endpoints that are restricted to JavaScript? Sure, JSON makes data easy to use for the JavaScript-challenged, but it restricts your architecture.

    These days I’m promoting service-oriented architectures at both large and small scales – the web applications our teams are writing are all based on Xml data sources, using GET endpoints and web service endpoints for modifying data. This gives me the flexibility to wire up components that are deployed in multiple locations in the web app, as well as remote portal locations like SharePoint or WebSphere, and rich client interfaces like Windows apps or Flash GUIs. I can also use a combination of Xslt for rendering, or parse the stream to grab additional data.

    It is a shame that Microsoft’s Atlas (ASP.NET AJAX) framework lacks a good framework for working with Xslt. Now comes the balance, there is a place for JSON I suppose, but I’m not sure what it is. I’m sure Brian has some opinions. But it shouldn’t be the central architecture for the communication layer, unless you’re only writing monolithic web apps. And if you’re doing that, why not just use ASP.NET controls and update panels? (disclaimer: that last sentence was sarcastic as all get out. ASP.NET is dead.)

    Reply

  60. Look at the source of your neighborhood JSON parser. Compare it to the source of the XML parser. Maybe just compare file sizes.

    It’s easier to keep the rubber on the road when there’s less software.

    Reply

  61. I’m a big fan of XML and XML-RPC. However, there are two reasons I’ve had to use JSON on projects:

    1) It allows web sites to receive data from third-party web sites through the SCRIPT tag. The SCRIPT and IMG tags are the only two ways to pull content in from third-party sites without using a server-side proxy. This is a big hack, yes, but it can be used to make some interesting things.

    2) For Ajax/DHTML web apps that are getting lots of data from the server, it can be much faster in some cases to have it already in a JavaScript data structure than to have to parse it from XML.

    Both of these are hacks, admittedly, and in time I think we will see JSON disappear in favor of pure XML. In the meantime, it can be used to do some interesting tricks in the browser for more advanced web apps.

    Best,
    Brad Neuberg

    Reply

  62. By the way, because Delicious returns their result as JSON it means I could pull it into a web page directly using the SCRIPT tag and use the data for some web app; if it was XML I could not do this.

    Dave asked about actual benchmarks for JSON versus XML in the browser. I don’t have them on hand, but I have worked with clients where I directly benchmarked the performance of working over a large XML dataset using the XML parser in the browser versus using JSON. Their dataset was large, and using JSON sped things up significantly. I always do real benchmarking when making these kinds of decisions.

    In general, I’m actually a fan of XML versus JSON, but I’m not religious and if JSON is the only way to get a job done for a client I will use it.

    Brad

    Reply

  63. JSON has useful applications far outside the realms of JavaScript. I’ve written up an extended response to this discussion here:

    http://simonwillison.net/2006/Dec/20/json/

    It’s a great solution for moving simple data structures from one place to another, without the programming overhead involved in building and parsing.

    Reply

  64. A bit off topic, Dave THANKS MUCH! =) just was checkin SN on my phone and it looks real clean. No need to scroll left or right at all, content in a nicely centered column! Not sure if ya changed anything but now it looks like all the other rivers i read. Thanx again

    Reply

  65. There will always be people who will fight XML or for that matter anything.

    Reply

  66. The problem with using JSON outside of the browser is that it wasn’t really meant to do much more than be a JavaScript Object Notation. If you want something similarly lightweight, but actually well-specified, and still human-readable, you might consider YAML, which actually *is* designed to be a cross-environment data transfer language.

    I still say if there’s no browser in the mix, you don’t want or need javascript object notation, you want a “real” serialization format. Doesn’t have to be XML, but picking up something because of a quirk of browser support when you don’t even have a browser in the loop seems like an odd tradeoff.

    (The other huge advantage of shipping down an “eval”-able string to a browser is that you can actually do even *more* than just the “JSON” subset… you can call functions to actually fire events into your system or anything else you can do with code. No other tech can give you that as easily in a browser. But you’d best be sure it’s from a trusted source…)

    Reply

  67. Sorry Ryan, never mind on that last sentence. Wasn’t important.

    Reply

  68. JSON is elegant. It is especially elegant when working with JavaScript.

    RSS is elegant. It is simple and easy to create, thus it is elegant too.

    So both are good designs, and I don’t know of anyone trying to do feeds in JSON. I have the JavaScript code to read the RSS that my server application creates. Even then, my experience is that JSON is faster and simpler to code than encoding/decoding XML. So personally I need both.

    And I don’t see what the big deal is. JSON came into being when JavaScript was invented, as others have said. It just is.

    So chill, even if it doesn’t snow in Berkeley. 🙂

    Reply

  69. So while I agree with Dave on the JSON vs XML stuff, what’s up with slamming reinventors? What about RSS – isn’t that just a reinvention of RDF? And does that make Atom even more evil? Are Java and C# evil since they’ve reinvented memory management? More comments on http://daniellarson.spaces.live.com.

    Reply

  70. Posted by Robert Krajewski on December 20, 2006 at 9:09 pm

    About XML vs. JSON, when there’s a reasonable argument for both… the same “endpoint” can serve data in both formats – just send an Accept header with either application/json or application/[whatever+]xml and code your server handler to pay attention to it. I believe Ruby on Rails is doing this now (and they’ve also got YAML to worry about).

    Reply

  71. Posted by Tim Towtdi on December 20, 2006 at 9:33 pm

    “There’s more than one way to do it.” – Larry Wall? or another JAPH?

    Dave, IMHO, took the potential of XML and made the browser irrelevant… he just found a way to push and pull XML over port 80.
    It created a revolution by allowing programmers to scan the assets of web servers without having to do HTML-based text scraping.

    Now, Javascript has emerged to make the browser-web server dialog more asynchronous and dynamic… I guess JSON is an enabler for that new distributed systems paradigm.

    I can’t wait to see Dave’s summary of this whole discussion… will he call off the lynch party, or escalate for more troops to keep the browser safe for XML? It could go either way.

    But sadly (or joyously)… TIMTOWTDI.
    Sometimes a baker’s dozen….

    Reply

  72. If Dave wants to do something about JSON, the real answer is to get the browser makers to allow web pages to talk to third-party domains in a safe way, to fetch just XML. Right now I have to use JSON to talk to third-party domains; it should not have to be our data exchange format – this is a big fat hack, but it’s the only way we can do things like mash up Yahoo search into our page, or a delicious feed.

    Right now the way Ajax works is you have an object called XMLHttpRequest, which allows a web page author to open a connection in the background back to the host the page came from. It is relatively clean, and has opened up a lot of innovation. However, because it can’t fetch XML documents from third-party domains, we have to use JSON + the SCRIPT tag.

    While folks have brought up the performance benefits of JSON versus XML while in the web browser, myself included, I believe that ultimately this is a red herring and the only real reason to currently use JSON is for fetching data from third-party web sites on the client side.

    Dave, if you want to see JSON disappear, put pressure on the browser makers to allow XMLHTTPRequest to GET XML documents cross-site (but don’t send any cookies along with the request, for security reasons).

    Brad

    Reply

  73. What about a simple postfix notation? http://www.piedcow.com/blog/archives/30

    I proposed it a while back on another site in response to a post about n3 vs. XML. Some more examples/ideas are there. http://enthusiasm.cozy.org/archives/2004/10/notation-3/

    It may be a little hard to read if you haven’t paid your 25 minutes of dues on an RPN calculator… but if you have, you should see the light… unless I’m missing something. Which is very probable since I’m on vacation and it’s late.

    Reply

  74. Posted by William Crim on December 21, 2006 at 12:27 am

    JSON can be parsed by using the Javascript eval() function. However it doesn’t have to be.

    The general idea is that you can eval() for simplicity on sites you trust, but call a parser (which is effectively a regular-expression check) for safety. It is language-independent and easily parsable. The http://www.json.org website has dozens of parsers.

    I suppose it is a simplicity issue. XML has lots of purposes, mostly relating to describing documents. JSON has one purpose, describing data structures in a way that maps to commonly used programming languages. It also has the benefit of being simple to use for the most common case; which is describing data for which I control the producer and the consumer.

    XML-RPC/SOAP/JSON-based interoperability is like code sharing. Theoretically useful, but practically a non-issue. The time I spend coding application-related data/RPC interchange vastly outweighs the transport-related concerns. I don’t know about anyone else, but I typically use web service calls back to my own server, rather than slinging requests across the web. We use RPC to help distribute load or prevent unnecessary postbacks. I find I use RPC to help overcome the statelessness of the web client, rather than for Web 2.0-style interop.
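    William’s “parser (which is effectively a regular expression)” can be made concrete. This is the validity guard from RFC 4627 wrapped in a function (the function name is invented); it is a check before eval, not a full parser:

```javascript
// Strip string literals, then reject any text containing characters
// other than JSON punctuation, numbers, whitespace, and the letters
// used by true/false/null. What survives is safe to eval.
function parseJson(text) {
  var stripped = text.replace(/"(\\.|[^"\\])*"/g, "");
  if (/[^,:{}\[\]0-9.\-+Eaeflnr-u \n\r\t]/.test(stripped)) {
    throw new Error("not JSON");
  }
  return eval("(" + text + ")");
}

var ok = parseJson('{"a": [1, 2, 3], "b": true}'); // a native object
// parseJson('alert("gotcha")') throws rather than running the code
```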

    Reply

  75. As a point of clarification: JSON is a defined subset of JavaScript. If you’re sending executable JavaScript (with function calls etc) you’re not sending data that’s conformant to the JSON standard. There’s a pseudo-spec (as in people use it but no one has written it up formally yet) called JSON-P which allows a single callback function to be wrapped around a JSON object specifically for pulling off cross-domain Ajax calls. If you put JSON-P content through a conformant JSON parsing library you would get an error.

    The most common misconception about JSON is that it’s just JavaScript. It isn’t. It’s a spec that reuses the object-literal syntax from JavaScript but with some extra limitations to make implementing a parser even simpler (for example, JSON hash keys must have double quotes around them; these can be omitted in JavaScript).
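    A JSON-P exchange can be sketched in a few lines (the callback name and payload here are invented). The server wraps ordinary JSON in a call to a callback the client named; the client defines that callback and loads the response through a SCRIPT tag, which eval stands in for below:

```javascript
// What a JSON-P response body looks like for ?callback=handleLinks:
var response = 'handleLinks({"url": "http://example.com/", "count": 3})';

// The client-side callback that receives the data.
var received = null;
function handleLinks(data) { received = data; }

// In a browser this would execute via a dynamically inserted
// <script> tag; eval simulates that here.
eval(response);
// received.count === 3
```

    A strict JSON parser would reject the wrapped form, since "handleLinks(...)" is not itself valid JSON.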

    Reply

  76. “If you put JSON-P content through a conformant JSON parsing library you would get an error.”

    You might. But the JSON spec allows processors to accept things that aren’t part of the standard. And, hey, who are we kidding? That’s reality.

    Reply

  77. Robert: good point; you’d get an error from a strict library (simplejson for Python doesn’t like invalid JSON) but it’s not required by the RFC:

    A JSON parser transforms a JSON text into another representation. A JSON parser MUST accept all texts that conform to the JSON grammar. A JSON parser MAY accept non-JSON forms or extensions.

    http://www.ietf.org/rfc/rfc4627.txt

    Reply

  78. Posted by Disgusted of Tunbridge Wells on December 21, 2006 at 2:32 am

    Why, oh why, oh why did they invent XML? There is nothing wrong with good old fashioned S-Expressions. This data format diversity is political correctness gone mad.

    Reply

  79. Posted by Orion Edwards on December 21, 2006 at 3:36 am

    I’m sure nobody’s going to read this, but IMHO, JSON is a good thing. Not being XML is a good thing. Why?

    Well, here’s a hash in ruby (the day job), which is how I’d generally pass some simple data from A to B without needing to constrain it to a declared interface/class/etc.

    { 'key' => 'value', :otherkey => [1,2,3,4] }

    Here’s it in JSON

    { "key": "value", "otherkey": [1,2,3,4] }

    Now if I fire that down the wire to another client, written in say, PHP, here it is:

    array( 'key' => 'value', 'otherkey' => array( 1,2,3,4 ) )

    I could even write it similarly in .NET or Java, and even if I chose to constrain it to types, it is trivial to loop over the array and convert each item to strongly typed objects and so forth.

    Now, here’s the same thing in XML… Not even SOAP or XML-RPC or anything, just the simplest cleanest XML I can write to try represent that structure.

    keyvalue
    otherkey
    123

    Now, can you see why JSON is a good thing? Sure ALL of the above examples are human readable, just some are more readable than others… I actually happen to think the ruby hash is the best one, but JSON is not far behind.

    Reply

  80. Posted by Orion Edwards on December 21, 2006 at 3:38 am

    ARGH HTML ate my angle brackets. The ‘xml’ example above should be:

    …..Now, here’s the same thing in XML… Not even SOAP or XML-RPC or anything, just the simplest cleanest XML I can write to try represent that structure.

    <array>
    <item><key>key</key><value>value</value></item>
    <item><key>otherkey</key><value>
    <array><item>1</item><item>2</item><item>3</item><item>4</item></array>
    </value></item></array>

    Now, can you see why JSON is a good thing? Sure ALL of the above examples are human readable, just some are more readable than others… I actually happen to think the ruby hash is the best one, but JSON is not far behind.

    Reply

  81. The purpose of JSON is JSONP. It should always, imho, be provided alongside an XML API. The benefit of JSONP is simple: it is dead simple to use in JavaScript apps, and there are a lot of JavaScript apps. In PHP, it’s mostly only useful to produce. Certainly it will never replace XML.

    Reply

  82. Posted by Erik Terpstra on December 21, 2006 at 6:49 am

    What’s the big deal? REST is just an architectural style, so the del.icio.us API can easily support XML as well.
    You just use different Accept headers for your preferred format:

    curl -iH 'Accept: application/xml' http://some/uri
    curl -iH 'Accept: application/json' http://some/uri
    curl -iH 'Accept: text/plain' http://some/uri
    curl -iH 'Accept: text/html' http://some/uri
    curl -iH 'Accept: application/ruby' http://some/uri

    REST doesn’t need an Object serialization format, just choose one (or more) that makes your users happy.

    Reply

  83. Posted by neville roxby on December 21, 2006 at 7:00 am

  83. It’s simple: JSON is natively supported in JavaScript, so it is perfect for client-side scripting. As long as a user’s browser can use JavaScript, it can use JSON. XML does not work the same way in JavaScript as JSON does, and the myriad of different browsers out there support XML in JavaScript to very different degrees. Therefore JSON is an excellent way of communicating between browsers and servers. XML is messy at best.

    Reply

  84. Stephen: You’re right that JSON will never replace XML – there are plenty of applications for which XML is well suited and JSON isn’t. However, there are also lots of applications outside of JSONP where JSON is a better fit than XML. Best tool for the job and all that…

    Reply

  85. Brand:

    I guess that you haven’t followed Douglas’s most recent proposals, JSONRequest and . I talk about them here:
    http://www.ashleyit.com/blogs/brentashley/2006/10/30/secure-ajax-mashups-by-design/

    and here:
    http://www.ashleyit.com/blogs/brentashley/2006/10/29/quite-the-experience/

    As I say in one of the posts:

    “Even if Douglas’s proposals don’t end up being the solution to these problems that is implemented, I believe that he has provided the most comprehensive place to begin discussions towards fixing up the browser to be a place that was purposefully designed for mashups.”

    I agree that it’s up to the development community to convince the browser makers to make these forward-looking changes.

    Reply

  86. oops, that was for Brad (Neuberg) of course.

    Reply

  87. http://en.wikipedia.org/wiki/Json explains it pretty well. In a JS and Ajax context, standard XML libs and techniques can be too complex in comparison to the alternatives. When you look at rich internet apps that have to support a mobile tier of clients, and then become concerned with cross-browser issues and client-side XML libs for parsing, the extra complexity becomes a big issue. Json is simple.

    On the server, adding a Json converter as a handler in a filter chain, just before the serialized outbound stream goes “on the wire”, is not a big deal. And it can serve to keep things simple in the JS layer on the client, where it really matters.

    Reply

  88. Say what you want about JSON. It has its downfalls, like everything, but it is rather easy to work with, both from JavaScript and from other languages.

    Want to use Yahoo! Web Search from Python? It is one line (admittedly a long one, but still):

    def web(x):
        return eval(urllib2.urlopen('http://api.search.yahoo.com/NewsSearchService/V1/newsSearch?appid=YahooDemo&query=%s&results=10&language=en&output=json' % urllib.quote(x)).read())

    Yahoo! News or Local is essentially the same one line. And what you get back is a rich data object. Using JSON, I am able to write one-liners that are very powerful and let me move along to doing real work.

    Recently, I developed a Web API for a search engine I built; it only returns JSON, a number of people are using it, and nobody has yet complained that they want XML or OpenSearch RSS. ( http://api.futef.com/apidocs.html for more information )

    Additionally, I don’t really understand the argument. I agree SOAP and WS-anything are evil and will die under their own weight. It was classic top-down, committee-driven, corporate-sponsored crap. JSON, like XML-RPC, is not from a company or a committee, and as far as I can tell there isn’t really even a serious fanboy community. People like it and it is useful – I am not sure where the anger is coming from, other than “I didn’t invent it” (which is ironically very much the attitude of the people Mr. Winer is complaining about). I do agree with some of Mr. Winer’s sentiments but believe they are misguided in this case – but hey, I am from NYC, so what do I know.

    Reply

  89. Posted by Ben on December 21, 2006 at 9:25 am

    Valid JSON (without comments) is YAML: http://redhanded.hobix.com/inspect/yamlIsJson.html

    So, it’s not just for Javascript.

    Reply

    Afaik, JSON doesn’t even handle native JavaScript Date objects. So much for the JavaScript support.

    Furthermore, everywhere it is recommended not to evaluate JSON with “one line of JavaScript”. Only a fool would give that much trust to remote code.

    However, I see some benefit here; it’s a fat-free alternative, indeed. It runs for competition and the bald, grumpy, old men always told me that competition is good. So why not sit back and relax?

    Reply

  91. I submit that web development is entering the mainstream as the serious work of rebuilding applications to be web native begins. We are in the “let a thousand flowers bloom” stage as developers acquire the skills to build these applications. One of my friends calls this the technical tower of babel and he is right on.

    Contracts versus convention, open versus proprietary, light versus heavy, REST versus SOAP vs whatever is next, desktop versus AJAX, this will go on and on and the market will sort it out in the usual messy way that human beings use to create a consensus.

    There is something real and important about all of this, and you have to love the passion that all sides bring to the argument.

    Reply

  92. Posted by David on December 21, 2006 at 10:30 am

    I like JSON better than XML because it is more directly representable in most languages. Arbitrary JSON can be converted to native objects in Perl, Python, Ruby, JavaScript, Lisp, and I’m sure a few more. By native structures I mean hashes and arrays. XML loses information if you try to convert it into straight hashes or arrays.

    Consider <item meta="hi"><key>a</key><value>b</value></item>

    It seems natural to want to have the representation end up as (in perl, for instance):
    { item => { a => b } }

    But then I’m losing info there. This mirrors the XML better but isn’t as convenient:
    { item => { key=>"a", value=>"b" } }

    But it’s still missing the attribute. Where do you put that? You have to tell it apart from the child nodes so it can’t just go inside like:

    { item => { meta=>"hi", key=>"a", value=>"b" } }

    If you don’t want to lose information then you have to add layers, which mess everything up even more. Essentially you have to use DOM or XPath to get stuff out of the XML tree, which is nice, but doesn’t fit in with the rest of the program.

    JSON, however, is actually a more restricted form of data. BUT it’s restricted in a useful way, since it’s based on what JavaScript can easily represent with its native hashes and arrays, which means that almost all modern scripting languages are able to handle the data it represents in a native way. Which means, hands down, it’s easier to work with than XML.
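    That point is easy to see in code: the JSON text becomes plain nested hashes and arrays on parse, and access is ordinary indexing rather than DOM or XPath calls. A small sketch (data invented, using an ES5-style JSON.parse or an equivalent library):

```javascript
// Nested JSON maps directly onto native objects and arrays;
// nothing is lost and no tree-walking API is needed.
var text = '{"item": {"meta": "hi", "key": "a", "value": "b"},' +
           ' "tags": ["x", "y", "z"]}';
var data = JSON.parse(text);

// data.item.meta === "hi"; data.tags.length === 3
```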

    Other comments have mentioned all the other things I like about it. It’s shorter, and it just %$#@! looks better. Data enclosed in curlies or parentheses is always going to look better than <somehugetagname>
    data</somehugetagname>–there’s just no debate.

    -David

    Reply

  93. We use SOAP and JSON in our web application. Our architecture speaks SOAP between the WebServer and our application repository but the conversation between the WebServer and our AJAX UI components uses JSON.

    The reason is that JSON is basic JavaScript and is supported in the same way in every browser. XML is implemented differently in every browser. So you either need something like Sarissa, which handles the differences for you, or you need to write code specific to every browser.

    In a world that is not dominated by a single browser (give thanks!), anything that has to be handled differently causes extra complexity and work.

    Reply

  94. I couldn’t read all of that discussion. I just had to say that (a) JSON’s super easy to handle in pretty much any language (what, a hash? RUN IN TERROR!), and (b) providing an API in one format does not preclude the option of providing an API in another format, so to say that this is a ‘bad move’ is simply ignorant. If you’ve written a JSON API and you recognize that it takes like fifteen minutes tops (esp. for something as basic as delicious) then you’d recognize that there’s nothing to be scared of. Reinventing the wheel is okay when the wheel is slightly more convenient (cheaper, whatever) and fits your purposes completely, right? Who needs a Goodyear for a scooter?

    Reply

  95. Does anyone know how to select nodes, like XPath, for JSON? I know there is JSONT, but it does not seem mature.
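    Short of a mature tool, a dot-path accessor over parsed JSON is easy to hand-roll. A minimal sketch (the function name is invented, and there are no predicates or wildcards, so it covers far less than XPath):

```javascript
// Walk a parsed JSON structure by a dot-separated path, treating
// numeric segments as array indices. Returns undefined for any
// missing step instead of throwing.
function select(obj, path) {
  var parts = path.split(".");
  for (var i = 0; i < parts.length; i++) {
    if (obj == null) return undefined;
    obj = obj[parts[i]];
  }
  return obj;
}

var doc = { posts: [{ title: "first" }, { title: "second" }] };
// select(doc, "posts.1.title") === "second"
```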

    Reply

  96. Posted by Masklinn on December 21, 2006 at 1:08 pm

    Stephen Paul Weber said,
    > The purpose of JSON is JSONP.
    No it’s not: the purpose of JSON is JSON; JSONP is something else and, even though it’s built on JSON, it’s not JSON.

    > Certainly it will never replace XML.
    Let’s hope not; we don’t need another XML. The point of JSON is to be a dead-simple and lightweight serialization language, not a general-purpose, one-size-fits-all gigantic container to fill with whatever you want.

    XML is nothing but potential (you can do whatever you want with XML, but XML in and of itself gives you nothing until you write your own dialect, or use an existing one), while JSON is an end-product.

    neville roxby said,
    > Its simple, JSON is natively supported in JavaScript, it is perfect for client side scripting.
    JSON is also very good for lightweight data transfers, when you don’t need extremely complex formats.

    derek said,
    > Say what you want about JSON. It has it’s downfalls like everything but it is rather easy to work with both from Javascript other languages
    > What to use Yahoo! Web Search from python it is one line (admittedly a long one but still)

    Am I dreaming or are you `eval`ing a random string you got from the net _in python_?

    For the love of god, please use a JSON lib (simplejson is very good); what you’re doing here is awfully dangerous.

    p3k said,
    > Afaik, JSON doesn’t even handle native JavaScript Date objects. So far for the JavaScript support.
    I fail to see the relation between your two statements. There is no equivalent to Javascript’s Date object in JSON, so what? How the hell can you conclude that Javascript therefore doesn’t support JSON, or that JSON therefore doesn’t work in javascript? I’m confused, there is no logical link between your two statements.

    Reply

  97. A lot of people were using JSON before it was called that. Everyone passed their variables to javascript by writing out javascript code. JSON was just the documentation of how to do it.

    Emitting JSON code from your service is no different than, say, emitting HTML from your service. (GADS!) Instead of sending out your serialized object as XML for parsing, why not just send out formatted XHTML with some microformats? It is easier to use, because it is delivered in the proper format for the browser, and formatted for a specific context.

    Here’s a kind of roundabout example. One “technique” I used a long time ago was to instantiate javascript objects by emitting javascript code. That is, instead of sending a JSON array, and then looping over it to make objects, I would just emit the javascript code to make the objects:

    var ar = [];
    ar[0] = new obj(1, 'abc', 43);
    ar[1] = new obj(43243, 'def', 3432);
    ar[2] = new obj(55, 'wer', 1102);

    That would show up in the browser. It wasn’t JSON-y, but it was pretty efficient, and easy to debug. Yes, it’s not neutral, but, sometimes, neutrality isn’t important, or it’s just somewhat important.

    I definitely didn’t need the neutrality of XML, and not anything as elaborate as XML-RPC. This wasn’t a public interface.

    (In fact, when I tried to use the technique as a public interface, it failed to scale, topping out at 50,000 infrequent users. It failed to scale because too much configuration was put on the server side, when it should have been left on the client side. The correct way could have been found using something like JSON or maybe XML-RPC, but this was a while back in the old browser days.)

    I’m not so hot on JSON-RPC. Despite what people here have said, I think JSON notation is not really simpler than XML. JSON is nice and terse, and good for very uniform structures, or simple things like arrays of strings, but when you get into aggregated objects, it’s not going to be easy to read. The whole point of using XML is to transparently preserve the hierarchy so the programmer can write a client to consume the object. You’re not going to get that with JSON.

    { 'a': 123, 'b': 345, 'c': [ 'a': { 'x': 123, 'y': 200 }, 'd': { 'x': 123, 'y': 200 }, 'e': { 'x': 123, 'y': 200 }, 'f': { 'x': 123, 'y': 200 }, 'g': { 'x': 123, 'y': 200 } ] }

    Okay, that’s a contrived example, but, it’s hard to read. Even with pretty printing, it isn’t going to be very easy to read. It’s not a nice API. It’s almost a proprietary data format.
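    For reference, a corrected, nested version of roughly the same data (the 'c' member rewritten as an object, since arrays cannot hold key:value pairs) is legal JavaScript and can be pretty-printed. This sketch assumes an environment with JSON.stringify, which was only standardized later, in ES5:

    ```javascript
    // The sample above, rewritten as well-formed data.
    var data = {
      a: 123,
      b: 345,
      c: {
        a: { x: 123, y: 200 },
        d: { x: 123, y: 200 },
        e: { x: 123, y: 200 }
      }
    };
    var pretty = JSON.stringify(data, null, 2); // indented, multi-line output
    ```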

    Reply

  98. TurboGears has nice support for JSON, in that its web-exposed methods return Python dictionaries, which are usually plugged into XML Kid templates, but can also be returned directly to the client as JSON data.

    There’s a “jsonify” package that lets you register type converters that translate from more complex data types (like dates, Python classes, or SQLObject and SQLAlchemy objects from the SQL database) into simple flat JSON supported types.

    For example, here’s a converter function, decorated with a "@jsonify.when" expression, that registers it to be used to convert SQLAlchemy/SQLSoup objects to JSON. So now my TurboGears controller methods can directly return objects from the database as JSON, or Python dictionaries and arrays indirectly containing any number of SQLSoup objects.

    @jsonify.when("hasattr(obj, 'c') and (obj.c.__class__.__name__ == 'LOrderedProp')")
    def jsonify_saobject(obj):
        props = {}
        for key in obj.c.keys():
            props[key] = getattr(obj, key)
        return props

    -Don

    Reply

  99. Posted by Masklinn on December 21, 2006 at 4:13 pm

    > Okay, that’s a contrived example, but, it’s hard to read (and malformed, too).

    It’s not contrived, it’s forcibly stupid, doing the same kind of things in XML becomes downright unreadable:
    <root> <a>123</a> <b>345</b> <c> <a x="123" y="200"/> <d x="123" y="200"/> <e x="123" y="200"/> <f x="123" y="200"/> <g x="123" y="200"/> </c> </root>

    Don’t blame the format when you’re going through numerous pains to make the output unusable. Please.

    > The whole point of using XML is to transparently preserve the hierarchy so the programmer can write a client to consume the object. You’re not going to get that with JSON.

    What?

    May I ask what the hell stops you from creating hierarchies in JSON if that rocks your boat? Just cascade objects or Arrays and you’re done, case closed…
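    A minimal sketch of what that cascading looks like in practice (all names here are made up for illustration):

    ```javascript
    // Hierarchy in JSON is just nesting: objects inside arrays inside objects.
    var invoice = {
      title: 'Invoice #42',
      customer: { name: 'Example Co.', address: { city: 'Springfield' } },
      items: [
        { sku: 'a-1', qty: 2 },
        { sku: 'b-7', qty: 1 }
      ]
    };
    var city = invoice.customer.address.city; // 'Springfield'
    var firstSku = invoice.items[0].sku;      // 'a-1'
    ```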

    Reply

  100. Ok….

    I have to say something to all the above who keep supporting JSON.

    People keep using the acronym AJAX alongside JSON. If you use JSON and not XML, you don’t have an AJAX application. That wouldn’t make sense, since AJAX is about XML and JavaScript.

    Just want to clarify this. I see people starting arguments from a Web 2.0 standpoint. But this is more about coding structure and hierarchies. It is an XML vs JSON conversation.

    Thank You.

    …sigh

    Reply

  101. And Another note.

    I see people saying that JSON is easier to parse than XML. Maybe so. But how much quicker? An eighth of a second? Does it really matter? It matters for the site’s overall performance, sure, but I would look more at the HTML coder’s ability to produce one good, W3C-valid document, which will lower load time, rather than waste my time on the few extra bytes added to my overhead because one format has more carriage returns than the other.

    Reply

  102. Posted by casey on December 21, 2006 at 10:46 pm

    Why does the author assume that JSON is intended to reinvent the wheel?

    If XML were efficient to parse, easy to use, and easy to integrate into Javascript, probably no one would have ever seen a need for JSON.

    The fact is, representing free-form data like a dictionary is an impedance mismatch for XML. JSON fits, and when it does, I’ll use it.

    I wouldn’t presume to try to fit it into every problem as a be-all-end-all solution. I think XML has done enough damage in that space.

    Reply

  103. Posted by Masklinn on December 21, 2006 at 11:43 pm

    Frank Cefalu said,
    > People keep using the Acronym AJAX along side JSON. If you use JSON and not XML you don’t have an AJAX Application. Because that wouldn’t make sense, since AJAX is about XML And Javascript.

    This is an argument of retardation, please don’t use it, the term “AJAX” has been progressively emptied of most of its meaning during the last 24 months and has become no more than a buzzword indicating remote calls done in javascript.

    And sometimes not even that since a lot of people consider stuff like drag&drop or visual effects to be part of “AJAX”.

    Whether you use HTML, XML, JSON or plain text as a response medium doesn’t change anything.

    Frank Cefalu said,
    > Does it really matter?

    I don’t know, I know that I use JSON because it’s lightweight, it has a clear typing handled by the language itself, and — most importantly — *it requires me to write much less code*

    Reply

  104. >> The whole point of using XML is to transparently preserve the hierarchy so the programmer can write a client to consume the object. You’re not going to get that with JSON.

    >What?

    >May I ask what the hell stops you from creating hierarchies in JSON if that rocks your boat? Just cascade objects or Arrays and you’re done, case closed…

    Nothing really prevents me from using cascading arrays, but objects serialized into xml are going to tend to be easier to understand than arrays serialized into JSON. For one thing, the xml can be copied over into an editor or browser that will pretty-print it for you. There’s also social pressure to use meaningful xml tags, because the client consuming the data might be an artist using Dreamweaver; so programmers *might* try to add semantic info to the data.

    (Eventually, we’ll have JSON pretty printers and annotation techniques, but it would start to feel like bloated xml.)

    In situations where semantics matter, it’s usually because the data server and clients are in different departments, projects, or organizations. They’re probably communicating over busy networks, so the client should be caching the data. In this situation, xml’s size is not a liability, and its semantic detail is an asset, especially to the programmer writing the client code.

    There are other situations where xml is better than JSON or a native format; config files, or archiving data, for example. Someone other than the original programmer has to deal with this data, so xml’s nice.

    I’m not an xml bigot. I already use JSON more than xml, and the built-in serializing functions more than both of those. The different techniques are good for different uses. Contrary to what others have said, differences *do* make a difference. That’s why JSON should not be used when xml is better, and vice versa.

    Reply

  105. “Dan, where are the benchmarks that say that on a processor capable of running Flash apps or playing Youtube videos that it can’t parse XML “cruft.” If you’re going to use an engineering argument, be prepared to back up t your assumptions with data. I’m pretty sure you can’t because I ran my own benchmarks on the end-user CPU of the late 90s and the overhead of parsing XML was negligable then, and now..? Geez Louise. How many gigahertz to people have these days??”

    When it comes to Javascript and browser-based communication, the difference in processor power is negligible compared to the overhead of transporting information over the internet. Without needing to get into any benchmarks, you can assume JSON is faster because of the following:

    JSON is native to Javascript. Parsing a JSON response is a matter of using an eval() on a string. Otherwise, using XML-RPC or any other XML-based envelope, the user would have to download extraneous code to strip data out of a DOM object into native Javascript variables, which can grow pretty fast when you consider looping through the results of document.getElementsByTagName('tag') over and over again.
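    A rough sketch of that difference (field names invented for illustration):

    ```javascript
    // JSON route: one eval and the data is already native JavaScript.
    var jsonText = '{"users": [{"id": 1, "name": "ann"}, {"id": 2, "name": "bob"}]}';
    var data = eval('(' + jsonText + ')'); // parens keep the {} from parsing as a block
    var secondName = data.users[1].name;   // 'bob', no unpacking code needed

    // The XML route (assuming `doc` is an already-parsed DOM document) would be more like:
    //   var nodes = doc.getElementsByTagName('user');
    //   var name = nodes[1].getElementsByTagName('name')[0].firstChild.nodeValue;
    ```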

    JSON is lightweight. The amount of text involved in wrapping data in XML is likely greater than if it were wrapped in JSON. This is easily measurable by comparing equivalent statements in JSON and XML-based specs (like XML-RPC or, god forbid, SOAP).
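    A quick, hedged illustration of that size gap, comparing a made-up two-field struct in JSON with its XML-RPC encoding:

    ```javascript
    // The same two fields, once as JSON and once wrapped in XML-RPC markup.
    var asJson = '{"id": 7, "name": "ann"}';
    var asXmlRpc =
      '<struct>' +
      '<member><name>id</name><value><int>7</int></value></member>' +
      '<member><name>name</name><value><string>ann</string></value></member>' +
      '</struct>';
    // The JSON form is a fraction of the size, and the gap widens with more fields.
    ```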

    JSON is less work on the programmer. It’s easier to work with than most anything. Especially when it comes to something like AJAX.

    When it comes to browser-based communication, I don’t even know why there’s a question.

    Taken out of context, sure, JSON is hard to parse. But so is XML without the use of SAX or DOM. Even with a good XML toolset, working with SOAP or XML-RPC still requires the use of helper libraries (when you don’t want to re-invent the wheel). When it’s abstracted that far, what difference does it make? Why bitch and moan about it?

    Reply

  106. “People keep using the Acronym AJAX along side JSON. If you use JSON and not XML you don’t have an AJAX Application. Because that wouldn’t make sense, since AJAX is about XML And Javascript.”

    You’re right. JSON doesn’t fall within “Asynchronous JavaScript and XML”. But AJAX does imply the use of XMLHttpRequest(), which doesn’t actually require XML.

    Besides, who cares?

    Reply

  107. Posted by Henry on December 30, 2006 at 8:22 am

    JSON is a kludge because for some unfathomable reason, Javascript has the equivalent of Lisp’s ‘eval’ , but not ‘read’. And for some further unfathomable reason, nobody seems to understand this simple yet important difference.

    Reply

  108. I don’t have time to read through all of these responses, but I simply want to say that JSON is simply JavaScript literal object notation, which has been around for a very long time – no one created this out of thin air.

    And to various uneducated responses:

    – JSON is not a kludge, it is basically JavaScript literal object notation – go do some reading.

    – ‘Ajax’ can be ‘Ajax’ without XML (people need to go read the original article (http://www.adaptivepath.com/publications/essays/archives/000385.php), apparently too many take the acronym too strictly)

    Reply

  109. Posted by HeroreV on January 7, 2007 at 2:47 am

    E4X makes XML easier to work with than JSON. If Internet Explorer supported E4X, almost nobody would use JSON.

    Reply

Leave a comment