In Part 1 and Part 2, we presented JSON combinators for Reads[T], Writes[T] and Format[T]. So by now you should know how to validate JSON, convert it into any structure you can write in Scala, and back to JSON. But as soon as I began using those combinators to write web applications, I almost immediately hit a case: read JSON from the network, validate it and convert it into… JSON.
Introducing JSON coast-to-coast design
Are we doomed to convert JSON to OO?
For a few years now, in almost all web frameworks (except maybe recent server-side JS stuff in which JSON is the default data structure), we have been used to getting JSON from the network and converting JSON (or even POST/GET data) into OO structures such as classes (or case classes in Scala). Why?
- for a good reason: OO structures are “language-native” and allow manipulating data with respect to your business logic in a seamless way, while ensuring isolation of the business logic from the web layers.
- for a more questionable reason: ORM frameworks talk to the DB only with OO structures, and we have (kind of) convinced ourselves that it was impossible to do otherwise… with the well-known good & bad sides of ORMs… (I'm not here to criticize them)
Is OO conversion really the default use case?
In many cases, you don't really need to perform any real business logic on the data, just validate/transform it before storing it or after extracting it.
Let’s take the CRUD case:
- You just get the data from the network, validate it a bit and insert/update it in the DB.
- Going the other way, you just retrieve data from the DB and send it outside.
So, generally, for CRUD ops, you convert JSON into an OO structure just because the frameworks are only able to speak OO.
I'm not saying you shouldn't use JSON to OO conversion, but maybe this is not the most common case, and we should keep conversion to OO only for when we have real business logic to fulfill.
New tech players change the way of manipulating JSON
Besides this, we now have DB types such as MongoDB (or CouchDB) accepting document-structured data that looks almost like JSON trees (isn't BSON Binary JSON, after all?).
With these DB types, we also have great new tools such as ReactiveMongo, which provides a reactive environment to stream data to and from Mongo in a very natural way.
I've been working with Stephane Godbillon to integrate ReactiveMongo with Play2.1 while writing the Play2-ReactiveMongo module. Besides Mongo facilities for Play2.1, this module provides JSON to/from BSON conversion typeclasses.
So you can manipulate JSON flows to and from the DB directly, without ever converting into OO.
JSON coast-to-coast design
Taking this into account, we can easily imagine the following:
- receive JSON,
- validate JSON,
- transform JSON to fit expected DB document structure,
- directly send JSON to DB (or somewhere else)
The same applies when serving data from the DB:
- extract some data from the DB directly as JSON,
- filter/transform this JSON to send only the required data in the format expected by the client (for example, you don't want some secure info to leak out),
- directly send the JSON to the client
In this context, we can easily imagine manipulating a flow of JSON data from client to DB and back, without any (explicit) transformation into anything other than JSON.
Naturally, when you plug this transformation flow onto the reactive infrastructure provided by Play2.1, it suddenly opens new horizons.
This is the so-called (by me) JSON coast-to-coast design:
- Don't consider JSON data chunk by chunk, but as a continuous flow of data from the client to the DB (or elsewhere) through the server,
- Treat the JSON flow like a pipe that you connect to other pipes, applying modifications and transformations along the way,
- Treat the flow in a fully asynchronous/non-blocking way.
This is also one of the raisons d'être of Play2.1's reactive architecture…
I believe that considering your app through the prism of flows of data drastically changes the way you design web apps in general. It may also open new functional scopes that fit today's webapp requirements better than classic architectures. Anyway, this is not the subject here ;)
So, as you will have deduced by yourself, to be able to manipulate JSON flows based on validation and transformation directly, we needed some new tools. JSON combinators were good candidates, but they are a bit too generic.
That's why we created some specialized combinators and an API called JSON transformers to do that.
JSON transformers are Reads[T <: JsValue]
You might say that JSON transformers are just f: JSON => JSON.
So a JSON transformer could simply be a Writes[A <: JsValue].
But a JSON transformer is not only a function: as we said, we also want to validate the JSON while transforming it.
As a consequence, a JSON transformer is a Reads[A <: JsValue].
Keep in mind that a Reads[A <: JsValue] is able to transform, and not only to read/validate.
Recent Play2 JSON syntax evolutions
As you know, the JSON API for Play2.1 was still a draft and has evolved since I began writing parts 1 and 2 of this article.
We have changed a few things since then (nothing conceptual, just cosmetics).
Syntax clarification
- Reads[A] andThen Reads[B] has been renamed Reads[A] andKeep Reads[B] (keeps the right-side result)
- Reads[A] provided Reads[B] has been renamed Reads[A] keepAnd Reads[B] (keeps the left-side result; it is symmetric with andKeep)
A small sketch of which result is kept follows.
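This is a minimal sketch, assuming the standard minLength/maxLength validators from Reads; the field name key1 is just for illustration:

```scala
import play.api.libs.json._
import play.api.libs.json.Reads._
import play.api.libs.functional.syntax._

// keepAnd: both Reads are applied to the same value, but the LEFT result is kept.
// Here both length checks run on 'key1; the value read by the left Reads is returned.
val nameReads: Reads[String] =
  (__ \ 'key1).read[String](minLength[String](2) keepAnd maxLength[String](50))

// andKeep is symmetric: both Reads are applied, but the RIGHT result is kept.
```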
Remarkable new Reads[A] features
- Reads[A <: JsValue] andThen Reads[B] : andThen has the classic Scala semantics of function composition: it applies Reads[A <: JsValue] on the JSON, retrieving a JsValue, and then applies Reads[B] on this JsValue.
- Reads[A <: JsValue].map(f: A => B): Reads[B] : map is the classic and always very useful Scala map function.
- Reads[A <: JsValue].flatMap(f: A => Reads[B]): Reads[B] : flatMap is the classic Scala flatMap function.
A quick sketch of these three follows.
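A minimal sketch, assuming a small ad-hoc JSON document (user, name, age and nickname are illustrative field names, not part of the samples used later):

```scala
import play.api.libs.json._

val doc = Json.obj("user" -> Json.obj("name" -> "bob", "age" -> 42))

// andThen: first pick the JsValue under /user, then read its 'name field
val nameReads: Reads[String] =
  (__ \ 'user).json.pick andThen (__ \ 'name).read[String]

// map: transform the value once it has been successfully read
val shoutedName: Reads[String] = nameReads.map(_.toUpperCase)

// flatMap: choose the next Reads depending on a value already read
val displayName: Reads[String] =
  (__ \ 'user \ 'age).read[Int].flatMap { age =>
    if (age >= 18) (__ \ 'user \ 'name).read[String]
    else (__ \ 'user \ 'nickname).read[String]
  }

doc.validate(shoutedName) // yields a JsSuccess containing "BOB"
```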
JSON transformers case by case
A few reminders
JSON new syntax
In code samples, we’ll use the following JSON.
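Something along these lines; the values the samples below actually rely on are key21 = 123, key23 = ["alpha","beta","gamma"] and key242 = "value242", the other values are just placeholders:

```json
{
  "key1" : "value1",
  "key2" : {
    "key21" : 123,
    "key22" : true,
    "key23" : [ "alpha", "beta", "gamma" ],
    "key24" : {
      "key241" : 234.123,
      "key242" : "value242"
    }
  },
  "key3" : 234
}
```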
Remember that, in Play2, you can write this JSON as follows.
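For instance, with the Json.obj/Json.arr builders (this json value is reused in the code samples below):

```scala
import play.api.libs.json._

val json = Json.obj(
  "key1" -> "value1",
  "key2" -> Json.obj(
    "key21" -> 123,
    "key22" -> true,
    "key23" -> Json.arr("alpha", "beta", "gamma"),
    "key24" -> Json.obj(
      "key241" -> 234.123,
      "key242" -> "value242"
    )
  ),
  "key3" -> 234
)
```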
Defining Play2 JSON action controller
Here is how you would write a Play2.1 action controller to receive and manipulate/validate JSON.
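A minimal sketch of such an action; the name verifyTransform is illustrative, and recoverTotal is used here as the "total" variant of recover to turn the JsError into a Result:

```scala
import play.api.libs.json._
import play.api.mvc._

object JsonController extends Controller {

  // Receives JSON, validates/transforms it with the given Reads and returns
  // either the resulting JSON or the validation errors as JSON.
  def verifyTransform(transformer: Reads[JsObject]) = Action(parse.json) { request =>
    request.body.transform(transformer).map { transformed =>
      Ok(transformed)
    }.recoverTotal { errors =>
      BadRequest(JsError.toFlatJson(errors))
    }
  }
}
```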
Please note the JsResult.map and JsResult.recover functions, which allow composing the result and dealing with errors.
Now let’s describe JSON transformers with samples
Case 1: Pick JSON value in JsPath
Pick value as JsValue
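Roughly, using the sample json defined in the reminders:

```scala
import play.api.libs.json._

val pickValue = (__ \ 'key2 \ 'key23).json.pick

json.transform(pickValue)
// => JsSuccess(["alpha","beta","gamma"],/key2/key23)
```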
- (__ \ 'key2 \ 'key23).json... : all JSON transformers live in JsPath.json.
- (__ \ 'key2 \ 'key23).json.pick : pick is a Reads[JsValue] which picks the value IN the given JsPath. Here ["alpha","beta","gamma"].
- JsSuccess(["alpha","beta","gamma"],/key2/key23) : this is simply a successful JsResult. For info, /key2/key23 represents the JsPath where the data were read; you don't really have to care about it, it's mainly used by the Play API to compose JsResult(s). The ["alpha","beta","gamma"] display is just due to the fact that we have overridden toString.
To Remember: jsPath.json.pick gets ONLY the value inside the JsPath.
Pick value as Type
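For instance:

```scala
import play.api.libs.json._

val pickArray = (__ \ 'key2 \ 'key23).json.pick[JsArray]

json.transform(pickArray)
// => JsSuccess(["alpha","beta","gamma"],/key2/key23)
```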
- (__ \ 'key2 \ 'key23).json.pick[JsArray] : pick[T] is a Reads[T <: JsValue] which picks the value (as a JsArray in our case) IN the given JsPath.
To Remember: jsPath.json.pick[T <: JsValue] extracts ONLY the typed value inside the JsPath.
Case 2: Pick branch following JsPath
Pick branch as JsValue
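For instance, picking the branch down to key242:

```scala
import play.api.libs.json._

val pickBranch = (__ \ 'key2 \ 'key24 \ 'key242).json.pickBranch

json.transform(pickBranch)
// => JsSuccess({"key2":{"key24":{"key242":"value242"}}},/key2/key24/key242)
```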
- (__ \ 'key2 \ 'key24 \ 'key242).json.pickBranch : pickBranch is a Reads[JsValue] which picks the branch from the root down to the given JsPath.
- {"key2":{"key24":{"key242":"value242"}}} : the result is the branch from the root to the given JsPath, including the JsValue at that JsPath.
To Remember: jsPath.json.pickBranch extracts the single branch down to the JsPath, plus the value inside the JsPath.
Case 3: Copy a value from input JsPath into a new JsPath
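For instance, copying the value found at (__ \ 'key2 \ 'key21) into a new branch:

```scala
import play.api.libs.json._

val copyValue = (__ \ 'key25 \ 'key251).json.copyFrom( (__ \ 'key2 \ 'key21).json.pick )

json.transform(copyValue)
// => JsSuccess({"key25":{"key251":123}},/key2/key21)
```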
- (__ \ 'key25 \ 'key251).json.copyFrom( reads: Reads[A <: JsValue] ) : copyFrom is a Reads[JsValue]. copyFrom reads the JsValue from the input JSON using the provided Reads[A], then copies this extracted JsValue as the leaf of a new branch corresponding to the given JsPath.
- {"key25":{"key251":123}} : copyFrom reads the value 123 and copies it into the new branch (__ \ 'key25 \ 'key251).
To Remember: jsPath.json.copyFrom(Reads[A <: JsValue]) reads a value from the input JSON and creates a new branch with the result as leaf.
Case 4: Copy full input Json & update a branch
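A sketch, adding an extra field under key24 (the field name/value "field243" -> "coucou" is just for illustration):

```scala
import play.api.libs.json._

val addField243 = (__ \ 'key2 \ 'key24).json.update(
  __.read[JsObject].map { o => o ++ Json.obj("field243" -> "coucou") }
)

json.transform(addField243)
// => JsSuccess of the full input JSON with "field243":"coucou" added under key24
```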
- (__ \ 'key2 \ 'key24).json.update(reads: Reads[A <: JsValue]) : is a Reads[JsObject].
- (__ \ 'key2 \ 'key24).json.update(reads) does 3 things:
  - extracts the value from the input JSON at JsPath (__ \ 'key2 \ 'key24),
  - applies reads on this relative value and re-creates the branch (__ \ 'key2 \ 'key24), adding the result of reads as leaf,
  - merges this branch with the full input JSON, replacing the existing branch (so it works only with an input JsObject and not any other type of JsValue).
- JsSuccess({…},) : just for info, there is no JsPath as 2nd parameter here because the JSON manipulation was done from the root JsPath.
To Remember: jsPath.json.update(Reads[A <: JsValue]) only works with a JsObject: it copies the full input JsObject and updates jsPath with the provided Reads[A <: JsValue].
Case 5: Put a given value in a new branch
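A sketch (the value 456 is arbitrary):

```scala
import play.api.libs.json._

val putValue = (__ \ 'key24 \ 'key241).json.put(JsNumber(456))

json.transform(putValue)
// => JsSuccess({"key24":{"key241":456}},)
```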
- (__ \ 'key24 \ 'key241).json.put( a: => JsValue ) : is a Reads[JsObject].
- (__ \ 'key24 \ 'key241).json.put( a: => JsValue ) : creates a new branch (__ \ 'key24 \ 'key241) and puts a as the leaf of this branch.
- jsPath.json.put( a: => JsValue ) : takes a JsValue argument passed by name, which allows passing even a closure to it.
- jsPath.json.put : doesn't care at all about the input JSON; it simply replaces the input JSON by the given value.
To Remember: jsPath.json.put( a: => JsValue ) creates a new branch with the given value, without taking the input JSON into account.
Case 6: Prune a branch from input JSON
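For instance:

```scala
import play.api.libs.json._

val pruneKey22 = (__ \ 'key2 \ 'key22).json.prune

json.transform(pruneKey22)
// => JsSuccess of the full input JSON with key22 removed from key2
//    (the key order in the result may differ from the input, see below)
```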
- (__ \ 'key2 \ 'key22).json.prune : is a Reads[JsObject] that works only with a JsObject.
- (__ \ 'key2 \ 'key22).json.prune : removes the given JsPath from the input JSON (key22 has disappeared under key2).
Please note the resulting JsObject doesn't have the same key order as the input JsObject. This is due to the implementation of JsObject and to the merge mechanism. But this is not important, since we have overridden the JsObject.equals method to take this into account.
To Remember: jsPath.json.prune only works with a JsObject and removes the given JsPath from the input JSON.
Please note that:
- prune doesn't work for recursive JsPath for the time being,
- if prune doesn't find any branch to delete, it doesn't generate any error and returns the unchanged JSON.
More complicated cases
Case 7: Pick a branch and update its content in 2 places
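A sketch of the transformer described below:

```scala
import play.api.libs.json._
import play.api.libs.json.Reads._

// pick the /key2 branch and, inside it, add 10 to key21 and append "delta" to key23
val pickAndUpdate = (__ \ 'key2).json.pickBranch(
  (__ \ 'key21).json.update(
    of[JsNumber].map { case JsNumber(nb) => JsNumber(nb + 10) }
  ) andThen
  (__ \ 'key23).json.update(
    of[JsArray].map { case JsArray(arr) => JsArray(arr :+ JsString("delta")) }
  )
)

json.transform(pickAndUpdate)
// => JsSuccess({"key2":{"key21":133, ..., "key23":["alpha","beta","gamma","delta"], ...}},/key2)
```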
- (__ \ 'key2).json.pickBranch(reads: Reads[A <: JsValue]) : extracts the branch __ \ 'key2 from the input JSON and applies reads to the relative leaf of this branch (only to the content).
- (__ \ 'key21).json.update(reads: Reads[A <: JsValue]) : updates the (__ \ 'key21) branch.
- of[JsNumber] : is just a Reads[JsNumber]; it extracts a JsNumber at (__ \ 'key21).
- of[JsNumber].map{ case JsNumber(nb) => JsNumber(nb + 10) } : reads a JsNumber (the value 123 at __ \ 'key21) and uses Reads[A].map to increase it by 10 (in an immutable way, naturally).
- andThen : is just the composition of 2 Reads[A]: the first reads is applied, and then its result is piped into the second reads.
- of[JsArray].map{ case JsArray(arr) => JsArray(arr :+ JsString("delta")) } : reads a JsArray (the value ["alpha","beta","gamma"] at __ \ 'key23) and uses Reads[A].map to append JsString("delta") to it.
Please note the result is just the __ \ 'key2 branch, since we picked only this branch.
Case 8: Pick a branch and prune a sub-branch
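For instance:

```scala
import play.api.libs.json._

// pick the /key2 branch and remove its key23 field
val pickWithoutKey23 = (__ \ 'key2).json.pickBranch(
  (__ \ 'key23).json.prune
)

json.transform(pickWithoutKey23)
// => JsSuccess({"key2":{"key21":123,"key22":true,"key24":{...}}},/key2/key23)
```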
- (__ \ 'key2).json.pickBranch(reads: Reads[A <: JsValue]) : extracts the branch __ \ 'key2 from the input JSON and applies reads to the relative leaf of this branch (only to the content).
- (__ \ 'key23).json.prune : removes the branch __ \ 'key23 from the relative JSON.
Please note the result is just the __ \ 'key2 branch without the key23 field.
What about combinators?
I'll stop there before it becomes boring (if it isn't already)…
Just keep in mind that you now have a huge toolkit to create generic JSON transformers.
You can compose, map and flatMap transformers together into other transformers, so the possibilities are almost infinite.
But there is a final point to address: mixing those great new JSON transformers with the Reads combinators presented previously.
This is quite trivial, as JSON transformers are just Reads[A <: JsValue].
Let’s demonstrate by writing a Gizmo to Gremlin JSON transformer.
Here is Gizmo:
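Something along these lines (the exact field values are illustrative; what matters for the transformer below is the structure: a name, a description with features, size and danger, and a loves field):

```json
{
  "name" : "gizmo",
  "description" : {
    "features" : [ "hairy", "cute", "gentle" ],
    "size" : 10,
    "sex" : "undefined",
    "life_expectancy" : "very old",
    "danger" : {
      "wet" : "curious",
      "feed_after_midnight" : "?"
    }
  },
  "loves" : "all"
}
```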
Here is Gremlin:
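And the target, along these lines:

```json
{
  "name" : "gremlin",
  "description" : {
    "features" : [ "skinny", "ugly", "evil" ],
    "size" : 30,
    "sex" : "undefined",
    "life_expectancy" : "very old",
    "danger" : "always"
  },
  "hates" : "all"
}
```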
Ok let’s write a JSON transformer to do this transformation
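A sketch of such a transformer (assuming the Gizmo JSON above has been parsed into a gizmo value, e.g. with Json.parse):

```scala
import play.api.libs.json._
import play.api.libs.json.Reads._
import play.api.libs.functional.syntax._

val gizmo2gremlin: Reads[JsObject] = (
  // force the name to "gremlin"
  (__ \ 'name).json.put(JsString("gremlin")) and
  // keep the description branch but rework its content
  (__ \ 'description).json.pickBranch(
    (__ \ 'size).json.update(
      of[JsNumber].map { case JsNumber(size) => JsNumber(size * 3) }
    ) and
    (__ \ 'features).json.put(Json.arr("skinny", "ugly", "evil")) and
    (__ \ 'danger).json.put(JsString("always"))
    reduce
  ) and
  // rename 'loves into 'hates, keeping its value
  (__ \ 'hates).json.copyFrom((__ \ 'loves).json.pick)
) reduce

// gizmo.transform(gizmo2gremlin) then yields the Gremlin JSON
// (assuming gizmo holds the Gizmo JSON above)
```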
Here we are ;)
I'm not going to explain all of this, because you should be able to understand it by now.
Just note:
- (__ \ 'features).json.put(…) comes after (__ \ 'size).json.update so that it overwrites the original (__ \ 'features).
- (Reads[JsObject] and Reads[JsObject]) reduce : it merges the results of both Reads[JsObject] (JsObject ++ JsObject); it also applies the same input JSON to both Reads[JsObject], unlike andThen, which injects the result of the first reads into the second one.
Conclusion
After 3 long articles, I think we have come full circle around the new features brought by the Play2.1 JSON API.
I hope you glimpse the whole new world of possibilities it can bring us.
Personally, I've begun using it in projects and I haven't yet found its limits or how I'll end up using it.
The only thing I can say is that it has changed my way of manipulating JSON data flows in general.
Next article coming soon: an applied example of a webapp following the JSON coast-to-coast design with ReactiveMongo.
Have fun ;););)