
Monday, July 29, 2013

JSON vs. XML: Holy Wars and Container Elements

I've been working on a demo based on David Webber's CAM/Open-XDX demo for Prescription Medication Checking (PMIX). Open-XDX offers an easy way to set up a schema-based REST API for XML and JSON. I am building a mobile app with PhoneGap, AngularJS and Saxon-CE.

Originally, my demo mobile application used Saxon-CE with a bit of JQuery to coordinate Ajax calls. I soon replaced JQuery with AngularJS because I wanted a chance to use this framework, and because I liked the name "AngularSaxon". Meanwhile, I was also figuring out a way to embed a Firefox OS build within an Android PhoneGap build.

It soon became apparent that Angular, at 80K, could perform many tasks faster than Saxon-CE, at 800K, and since I had both JSON and XML available, I started shifting my focus to allow a time trial between the two wire formats. One thing I discovered is that XSLT happily handles a repeating element whether it occurs once or many times, for example:

   <Prescription id="1">...</Prescription> 
   <Prescription id="2">...</Prescription> 

...but in JSON, the specific case where there is only one element surfaces as the difference between an Object and an Array, which causes problems with Angular's ng-repeat directive. I was able to create a workaround in my JavaScript objects, as detailed below.
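
To make the mismatch concrete, here is a rough sketch of how the same data can arrive in JSON (simplified placeholder names, not the actual PMIX payload):

   // Two prescriptions: the container maps to an Array, and ng-repeat is happy.
   { "Prescription": [ { "id": "1" }, { "id": "2" } ] }

   // One prescription: many XML-to-JSON mappings emit a lone Object instead,
   // so ng-repeat no longer sees a collection of rows.
   { "Prescription": { "id": "1" } }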

I had already created an Angular Service to handle rendering in both XSLT, using Saxon-CE, and in JSON. This Rendering Service exposes a copy of the data returned from an API call, and signals an Angular Controller when the data is ready. I added the following method to the Rendering Service to insert an absent Array around a lone container element:

   // Ensure parentNode[childName] is always an Array, so ng-repeat
   // behaves the same whether there is one child element or many.
   renderingService.fixJSONContainer =
      function(parentNode, childName) {
         if (parentNode !== undefined) {
            var childNode = parentNode[childName];
            if (!Array.isArray(childNode)) {
               // Wrap a lone Object in an Array (or use an empty
               // Array if the child element is absent altogether).
               parentNode[childName] =
                  (childNode === undefined) ? [] : [childNode];
            }
            return parentNode[childName];
         }
         return [];
      };
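
For example, given a report containing a single prescription, a call like this (the data literal is purely illustrative) leaves the property in the shape ng-repeat expects:

   // Illustrative data; the real report uses the pmp:-prefixed element names.
   var report = { 'pmp:Prescription': { 'pmp:PrescriptionNumberText': '1234' } };
   renderingService.fixJSONContainer(report, 'pmp:Prescription');
   // report['pmp:Prescription'] is now an Array containing the single Object.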


This helper method is available to the Response Controller, since the Rendering Service has already been dependency-injected there. In the Response Controller, I made the following changes to the render method for JSON:

   $scope.$on('renderJSON', function() {
      // Normalize any lone container Objects into Arrays first.
      $scope.fixJSONContainers(renderingService.data);

      $scope.pmix.resp.prescReport = renderingService.data[...];
      $scope.pmix.response.prescriptions =
         $scope.pmix.resp.prescReport['pmp:Prescription'];
   });
         

   // Schema-specific: normalize every container element that the CAM
   // template marks as makeRepeatable, at both the Prescription and
   // PrescriptionDrug levels.
   $scope.fixJSONContainers = function(data) {
      var prescriptionArray =
         renderingService.fixJSONContainer(data[...], 'pmp:Prescription');
      for (var i = 0; i < prescriptionArray.length; i++) {
         renderingService.fixJSONContainer(
            prescriptionArray[i], 'pmp:PrescriptionDrug');
      }
   };


The local method fixJSONContainers is specific to my schema, and requires intimate knowledge of it; for instance, which CAM elements are marked makeRepeatable. Other than that, this solution is generic. In my HTML View, Angular handles the data-binding using ng-repeat directives:
 
   <ol id="prescriptions">
      <li ng-repeat="prescription in pmix.response.prescriptions">
      <h3> Prescription #:
         {{prescription['pmp:PrescriptionNumberText']}}
         ({{prescription['pmp:DrugRefillNumberCount']}})</h3>
      <div ng-repeat="prescriptionDrug in 
          prescription['pmp:PrescriptionDrug']">
          <div> Prescription: 
              {{prescriptionDrug['pmp:DrugProductNameText']}}
              - {{prescriptionDrug['pmp:DrugStrengthText']}}
              - {{prescriptionDrug['pmp:DrugUnitOfMeasureText']}}
          </div>
       </div>
    </li>
 </ol> 
 
Once the changes have been made to the underlying Services and Controllers, the HTML View is very tight and concise, which is part of the magic of Angular. Autowired data-binding takes care of the rest.

Monday, July 15, 2013

JavaScript: The Cake and Eating it Too

In my previous post, I started to talk about some of the things I appreciate about Angular. Many of these things I also like about technologies such as Thymeleaf as a JSP replacement, Scala as a Java/Spring replacement, PhoneGap as a native mobile replacement... the list goes on. What all of these have in common (with the exception of Scala, and I will get to that eventually) is that they leverage and enable the potential of HTML5 by using HTML5 as template and view, a purpose for which it is well suited. In short, these technologies are all replacements for approaches that are harder to work with, and all come with a self-awareness that their purpose is to make themselves obsolete as more standardized functionality comes built into the browser and server alike.

This makes me very happy, of course, because I am excited about the lightweight native mobile web approach promised by Firefox OS and other browser-based platforms for the web and mobile. Why is this exciting? Because the developer community for native apps built in HTML5, CSS and JavaScript with a decent framework is huge. I would hazard a guess that Angular JavaScript could be taught in schools very easily, deployed onto mobile devices, and shared with other students using GitHub... it's a good time to be exploring new technologies.

My own personal preference is a sort of best-of-breed client layer formed by layering AngularJS and then Saxon-CE over Cordova (PhoneGap), which I have been referring to as "SaxonCord", although the more I use the functionality built into Angular, the less I rely on Saxon-CE as a client layer. For some things, like SVG manipulation or working with XML messaging standards like HL7 CDA or NIEM, Saxon-CE is a rockstar; however, these standards are becoming more open, and in doing so have started to embrace JSON, at which point the autowired data binding Angular provides is really all you need to inject data into a page. I have used client-side Inversion of Control frameworks in Adobe Flex; Angular is just plain easier.

Sunday, July 14, 2013

JavaScript: The New Glue

I don't work for Google, but let's talk about AngularJS.


Specifically, I have been working with the JavaScript framework AngularJS recently. When the web was very young, it served static pages, for which HTML was appropriate because browsers made it easy to look at a page and then look at the underlying markup; learning by view source was viral. Then the web started getting more dynamic, and technologies like JavaScript were introduced to allow for greater interactivity, but JavaScript never lost its reputation as "glue code".

Static webpages are a thing of the past. A mature JavaScript framework like AngularJS enables the browser to behave dynamically without hiding a lot of functionality in controllers; this means that the HTML code should still be easily understood at the level of view source. This is what I mean when I refer to the New Glue. JavaScript has evolved beyond "glue code": the real magic is in the autowired data binding provided by the underlying framework.
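
As a minimal sketch of what I mean (the controller and property names here are my own, purely for illustration), the markup stays readable at view-source level while the framework does the wiring:

   <!-- Hypothetical example: the only script is a one-line controller;
        the directives in the markup do the data-binding. -->
   <div ng-app ng-controller="GreetingController">
      <input type="text" ng-model="name">
      <p>Hello, {{name}}!</p>
   </div>

   <script>
      function GreetingController($scope) {
         $scope.name = 'world';   // the only "glue" code required
      }
   </script>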

This is nothing new, of course, but I do think it's important to note that use of a framework should allow you to build simple dynamic web pages without a lot of code; CSS can provide some of this as well. If you find yourself writing a lot of controller code for a dynamic webpage or mobile application, ask yourself whether you really need that level of complexity, or whether you are coding around the automagic.

The level of investment in a JavaScript framework is very low. A best-of-breed combination of Backbone, Marionette, jQuery, Underscore and so forth can produce great results, but it requires a reference implementation and some extra glue to coordinate the different libraries. With a framework, all you need to do to get started is reference the minified script from a remote server and build from there. That really leaves no excuse not to start exploring frameworks like AngularJS now, and to share your work.
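
To make that concrete, here is roughly all it takes to start; the CDN URL and version number below are just an example from around this time, so swap in whatever is current:

   <!DOCTYPE html>
   <html ng-app>
   <head>
      <!-- One remote script reference: no build step, no local install. -->
      <script src="//ajax.googleapis.com/ajax/libs/angularjs/1.0.7/angular.min.js"></script>
   </head>
   <body>
      <input type="text" ng-model="topic" placeholder="What are you exploring?">
      <p>Currently exploring: {{topic}}</p>
   </body>
   </html>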

#AngularSummer

Friday, April 05, 2013

Great interview with John Resig, discussing jQuery, Khan Academy, and the ubiquity of the Selectors API.

http://net.tutsplus.com/articles/where-in-the-world-is-john-resig/

Wednesday, January 11, 2012

Tim Bray on dynamic typing, Android, Java

See, this is why I respect Tim Bray's opinions so much: he is a tireless member of my post-SGML/functional programming tribe. For instance, see this post on static vs. dynamic typing, and why it's not such a big deal with mobile Java for Android. I particularly like this comment, though:

"From: Tim Converse (Dec 29 2011, at 10:32)

"The Java language in particular suffers from excessive ceremony and boilerplate. Also it lacks important constructs such as closures, first-class functions, and functional-programming support."

This is a very concise version of the case for Scala over Java."

Bingo.
ongoing by Tim Bray · Type-System Criteria

Wednesday, September 22, 2010

Ontology of Dream Landscape

A couple of things I have been thinking about recently come together here. Even dreams typically have a location, but it is a unique quality of dreams, at least the ones I have been having lately, to feature a location in isolation, separated from character or context. And in matters of taxonomy, more than three levels is seldom viable in practical terms, but two is seldom sufficient.

In my current work developing a financial application, I see a three-level vocabulary emerging that I have witnessed in other domains, typified as category, type and subtype.

If I were attempting to describe an ontology of dreams, therefore, I imagine I would use a category of "location", a type giving the location's name or "realm", and a subtype describing each specific "locale" within the realm. So, for instance:

/location/a_forest/one_of_many_paths

What I would like to do is build an API, attached to cloud storage, that allows people to describe their own dream landscapes in these terms. More on this as it develops. Please comment as you see fit.
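
Sketching what a stored entry might look like (the field names here are hypothetical, just to show the three levels in play):

   // A hypothetical dream-landscape record built around the three-level path.
   var dreamEntry = {
      category: 'location',
      realm: 'a_forest',              // type: the named realm
      locale: 'one_of_many_paths',    // subtype: a specific locale in the realm
      path: '/location/a_forest/one_of_many_paths',
      notes: 'free-form description of the dream goes here'
   };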

Friday, August 27, 2010

Context, content and getting over ourselves...

I am a huge fan of Lucas Gonze's weblog, where he wrote something recently which strikes me as quite profound.
"Keep music from the web in the web. Don't go to a music blog, download a track, and then listen in iTunes."
Instead, he advocates bookmarking and playing music in the page that contains it, once again returning to the fundamental link between URI and resource, between index and content.

What, for that matter, is a Content Management System? The term is a necessary evil; it's not like it is meaningless. But when you use this term to refer to WordPress or Blogger, I get an uneasy feeling, and reading Gonze's comment really cemented for me the reason why. The text on the page in front of you? It's not content. It's context. The page may provide content, but it is itself a context for whatever content it provides.

More on this later, just passing around the lightbulb moment, as it were.

Tuesday, August 18, 2009

Thoughts on the WebFinger Protocol

This comes as a response to Dare Obasanjo's post Some Thoughts on WebFinger and Personal Web Discovery. I am not going to summarize what WebFinger does, other than to paraphrase: WebFinger allows you to associate more of your identity with your email address. Smart, right?

Dare suggests that WebFinger might be more useful for making your online identity portable than for its intended end-user usage, which I agree with. I would like to keep all of my online identity in one place, but I have to take issue with the use of an email address for any purpose other than sending and receiving email (and I admit, I use my Gmail address for plenty of authentication out of necessity and convenience), because it encourages and softens people up for abuse by the password anti-pattern.

If there's one thing I appreciate about Facebook, LinkedIn and their kind, it's that they shield people from my email. I don't ever want to give anyone my email address, because I want to be able to turf it if I need to; at that point, people can still find me on Facebook etc. But, it's true, having an uncommon name is a mixed blessing. WebFinger seems like a good idea, but it also sounds kind of like it's grooming people for the password anti-pattern. We should be telling people "Don't give away your email, don't give away your email password..."

(From my comment on Dare's blog)

This is what my daughter thinks about gatekeepers:

Thursday, April 05, 2007

It's a Spider... Man! The friendly neighborhood internet:


Is the web becoming more collaborative? More semantic? More metaphoric? I expect that in the coming days all these things will come to pass, and most people will fail to notice. The original hypertext transfer protocol was groundbreaking because it filled a niche, and did so in a remarkable fashion. For early adopters, that was enough, and out of the aether, or ARPA, the web was born; but the sea change occurred when the web ceased to be primarily textual and became visual.


Currently loving on:

Mason Proper - Rest Up
(live in the WOXY lounge)

I still can't get enough Mason Proper, and their latest appearance in the WOXY lounge, their second, sounded great. I have Rachael from Underrated to thank for tipping me off to Mason Proper, so props!

I am led to believe that the next global shift in the web will also be visual, when 3D replaces 2D. I'm not sure Second Life fits the bill. I could be wrong, though: web 2.0 represents a decentering of the web object, shifting focus to the audience, and though web 2.0 applications do tend to share a visual look (well, rounded corners, obv, and tableless design), this is mostly stylistic, an attempt to "look 2.0".

My personal take is that the emerging web is still primarily solipsistic and protective, but I see the self-sustaining nature of Wikipedia becoming more pervasive as online identities become less anonymous.

Peter Parker was a newspaper photographer. Perhaps in the 21st C, he'd be a blogger. As his secret identity Spiderman, he maintains law and order in his friendly neighborhood, even though sometimes he has a hard time explaining how Peter Parker happened to be on the scene when he shows the photographs to J. Jonah Jameson later.

For the most part, the web is a friendly neighborhood, and Spiderman has a lot of help. I've seen plenty of flame-wars end with a troll banned or otherwise ejected, and the neighborhood returning to order. And people stand up for each other on the web, and for the things they believe in, against homophobia, misogyny, attacks on civil liberties, or any other unacceptable behavior. And I am proud of these people.

I think there is a tipping point in one's life when one gains a certain notoriety for one thing, and then applies that notoriety to reaching a wider audience.

And therein lies something I find problematic: Spiderman is notorious, Peter Parker is not. But Spiderman is the facade. We live in a time when anyone with a PC and an ISP can create a web presence online (a MySpace page, etc.), and the only reason many choose to remain anonymous is that it is the norm. A forum could decide to allow posts only from members who have created a web presence (an online location to associate with a person if they act in a destructive fashion; a personal namespace); at this point, true dialog begins to emerge from identification.

Of course, this ability exists already, but it is not the norm, and this sort of dialog runs against the extreme virtualization of Second Life and its ilk. Wikipedia and Digg are other examples of applications that successfully blend anonymity and collaboration.

I am increasingly enjoying Danah Boyd's Apophenia Blog. At the top of her blog roll is a little notice:

Welcome! If you're new, please check out Best-Of Apophenia. A feed for this blog is here.

This is fantastic! The Best-of link is a great idea. I can't count the number of times I've come across a weblog, liked what the person has to say, but been unable to really get a bead on where they're coming from, their ideals and so on. Putting a Best-of link right up front tells new and familiar visitors alike what you consider to be the writing that best conveys what you are trying to say. The words you choose are like the clothes you wear when you travel around the web-o-sphere: these are your red and blue spandex, so why not make them noticeable?