The code & sample apps can be found on Github

After 5 months studying theories deeper & deeper in my free time and preparing 3 talks for scala.io & ping-conf with my friend Julien Tournay aka @skaalf, I'm back blogging and I've got a few more article ideas to come…

If you’re interested in those talks, you can find pingconf videos here:

Let's get back to today's subject: the incoming Play2.3/Scala generic validation API & more.

Julien Tournay aka @skaalf has been working a lot for a few months developing this new API and has just published an article previewing Play 2.3 generic validation API.

This new API is the logical extension of the Play2/Scala Json API (which I've been working on & promoting for the last 2 years), pushing its principles much further by allowing validation on any data type.

This new API is a real step forward as it will progressively propose a common API for all validations in Play2/Scala (Form/Json/XML/…). It features an even more robust design relying on very strong theoretical grounds, making it very reliable & typesafe.

Julien has written his article presenting the new API basics and he also found time to write great documentation for this new validation API. I must confess the Json API doc was quite messy and I never found the free time (or courage) to do better. So I'm not going to spend time on the basic features of this new API; I'm going to target advanced features to open your minds to the power of this new API.

Let's have fun with this new API & Shapeless, this fantastic tool for higher-rank polymorphism & type-safety!


Warm-up with Higher-kind Zipping of Rules

A really cool new feature of the Play2.3 generic validation API is its ability to compose validation Rules in chains like:

val rule1: Rule[A, B] = ...
val rule2: Rule[B, C] = ...

val rule3: Rule[A, C] = rule1 compose rule2

In Play2.1 Json API, you couldn’t do that (you could only map on Reads).

Moreover, with the new validation API, as in the Json API, you can use macros to create basic validators from case classes.

case class FooBar(foo: String, bar: Int, foo2: Long)

val rule = Rule.gen[JsValue, FooBar]

/** Action to validate Json:
  * { foo: "toto", bar: 5, foo2: 2 }
  */
def action = Action(parse.json) { request =>
  rule.validate(request.body) map { foobar =>
    Ok(foobar.toString)
  } recoverTotal { errors =>
    BadRequest(errors.toString)
  }
}

Great, but sometimes that's not enough, as you'd like to add custom validations to your class. For example, you want to verify that:

  • foo isn't empty
  • bar is >= 5
  • foo2 is <= 10

For that you can't use the macro and must write your case class Rule yourself.

case class FooBar(foo: String, bar: Int, foo2: Long)

import play.api.data.mapping.json.Rules
import Rules._

val rule = From[JsValue] { __ =>
  (
    (__ \ "foo").read[String](notEmpty) ~
    (__ \ "bar").read[Int](min(5)) ~
    (__ \ "foo2").read[Long](max(10))
  )(FooBar.apply _)
}

Please note the new From[JsValue]: if it were XML, it would be From[Xml]; genericity requires a bit more info.

OK, that's not too hard. But sometimes you'd like to apply the macro first and then refine those primary type validations with custom ones. Something like:

Rule.gen[JsValue, FooBar] +?+?+ ( (notEmpty: Rule[String, String]) +: (min(5): Rule[Int, Int]) +: (max(10L): Rule[Long, Long]) )
// +?+?+ is a non-existing operator meaning "compose"

As you may know, you can't use +: from Scala's Seq[T] here, as this list of Rules is heterogeneously typed and Rule[I, O] is invariant.

So we are going to use a Shapeless heterogeneous HList for that:

val customRules =
  (notEmpty: Rule[String, String]) ::
  (min(5): Rule[Int, Int]) ::
  (max(10L): Rule[Long, Long]) ::
  HNil
// customRules is inferred as Rule[String, String] :: Rule[Int, Int] :: Rule[Long, Long] :: HNil


How to compose Rule[JsValue, FooBar] with Rule[String, String] :: Rule[Int, Int] :: Rule[Long, Long]?


We need to convert Rule[JsValue, FooBar] to something like Rule[JsValue, T <: HList].

Based on Shapeless Generic[T], we can provide a nice little new conversion API .hlisted:

val rule: Rule[JsValue, String :: Int :: Long :: HNil] = Rule.gen[JsValue, FooBar].hlisted

Generic[T] is able to convert any Scala case class from/to a Shapeless HList (& Coproduct).

So we can validate a case class with the macro and get a Rule[JsValue, T <: HList] from it.
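To see what Generic[T] buys us, here is the idea with plain tuples instead of HLists (a hand-written sketch; Shapeless derives the equivalent two functions automatically, producing an HList instead of a tuple):

```scala
case class FooBar(foo: String, bar: Int, foo2: Long)

// what Generic[FooBar] derives mechanically
// (with String :: Int :: Long :: HNil instead of a tuple)
def to(fb: FooBar): (String, Int, Long) = (fb.foo, fb.bar, fb.foo2)
def from(t: (String, Int, Long)): FooBar = FooBar(t._1, t._2, t._3)
```

`.hlisted` then simply maps a `Rule[JsValue, FooBar]` through such a `to` function.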



How to compose Rule[JsValue, String :: Int :: Long :: HNil] with Rule[String, String] :: Rule[Int, Int] :: Rule[Long, Long]?


Again, using a Shapeless polymorphic function and an HList RightFolder, we can implement a function:

(Rule[String, String] :: Rule[Int, Int] :: Rule[Long, Long] :: HNil) =>
  Rule[String :: Int :: Long :: HNil, String :: Int :: Long :: HNil]

This looks like some higher-kind zip function, let’s call it HZIP.
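To make the shape of HZIP concrete, here is a minimal sketch with a simplified Rule encoding (a plain function returning Either, a hypothetical stand-in for play.api.data.mapping.Rule) and a fixed arity of 3; the real implementation works on arbitrary HLists via a polymorphic function and a RightFolder:

```scala
// simplified stand-in for Rule (hypothetical)
type SimpleRule[I, O] = I => Either[List[String], O]

// zip three rules pointwise: each rule validates its own slot,
// errors from all slots are accumulated
def hzip3[A, B, C](ra: SimpleRule[A, A], rb: SimpleRule[B, B], rc: SimpleRule[C, C])
    : SimpleRule[(A, B, C), (A, B, C)] = {
  case (a, b, c) =>
    (ra(a), rb(b), rc(c)) match {
      case (Right(x), Right(y), Right(z)) => Right((x, y, z))
      case (ea, eb, ec) =>
        Left(List(ea, eb, ec).collect { case Left(errs) => errs }.flatten)
    }
}

val notEmpty: SimpleRule[String, String] =
  s => if (s.nonEmpty) Right(s) else Left(List("error.required"))
def min(n: Int): SimpleRule[Int, Int] =
  i => if (i >= n) Right(i) else Left(List("error.min"))
```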



Now, we can compose them…


val ruleZip = Rule.gen[JsValue, FooBar].hlisted compose hzip(
  (notEmpty: Rule[String, String]) ::
  (min(5): Rule[Int, Int]) ::
  (max(10L): Rule[Long, Long]) ::
  HNil
)

Finally, let’s wire all together in a Play action:

def hzipper = Action(parse.json) { request =>
  ruleZip.validate(request.body) map { foobar =>
    Ok(foobar.toString)
  } recoverTotal { errors =>
    BadRequest(errors.toString)
  }
}

// OK case
{
  "foo" : "toto",
  "bar" : 5,
  "foo2" : 5
} => toto :: 5 :: 5 :: HNil

// KO case
{
  "foo" : "",
  "bar" : 2,
  "foo2" : 12
} => Failure(List(
  ([0],List(ValidationError(error.max,WrappedArray(10)))),
  ([1],List(ValidationError(error.min,WrappedArray(5)))),
  ([2],List(ValidationError(error.required,WrappedArray())))
))

As you can see, the problem with this approach is that we lose the Json path in error reports. Anyway, this should give you a few ideas! Now let's do something really useful…



Higher-kind Fold of Rules to break the 22 limits

As in Play2.1 Json API, the new validation API provides an applicative builder which allows the following:

(Rule[I, A] ~ Rule[I, B] ~ Rule[I, C]).tupled => Rule[I, (A, B, C)]

(Rule[I, A] ~ Rule[I, B] ~ Rule[I, C])(MyClass.apply _) => Rule[I, MyClass]

But in the Play2.1 Json API, and also in the new validation API, all functional combinators are constrained by the famous Scala 22 limit.

In Scala, you CAN'T write:

  • a case-class with >22 fields
  • a Tuple23

So you can’t do Rule[JsValue, A] ~ Rule[JsValue, B] ~ ... more than 22 times.

Nevertheless, sometimes you receive huge JSON documents with far more than 22 fields in them. Then you have to build more complex models like case classes embedding case classes… Shameful, isn't it…

Let's be shameless with Shapeless HList, which enables unlimited heterogeneously typed lists!

So, with HList, we can write:

val bigRule =
  (__  \ "foo1").read[String] ::
  (__  \ "foo2").read[String] ::
  (__  \ "foo3").read[String] ::
  (__  \ "foo4").read[String] ::
  (__  \ "foo5").read[String] ::
  (__  \ "foo6").read[String] ::
  (__  \ "foo7").read[String] ::
  (__  \ "foo8").read[String] ::
  (__  \ "foo9").read[String] ::
  (__  \ "foo10").read[Int] ::
  (__  \ "foo11").read[Int] ::
  (__  \ "foo12").read[Int] ::
  (__  \ "foo13").read[Int] ::
  (__  \ "foo14").read[Int] ::
  (__  \ "foo15").read[Int] ::
  (__  \ "foo16").read[Int] ::
  (__  \ "foo17").read[Int] ::
  (__  \ "foo18").read[Int] ::
  (__  \ "foo19").read[Int] ::
  (__  \ "foo20").read[Boolean] ::
  (__  \ "foo21").read[Boolean] ::
  (__  \ "foo22").read[Boolean] ::
  (__  \ "foo23").read[Boolean] ::
  (__  \ "foo25").read[Boolean] ::
  (__  \ "foo26").read[Boolean] ::
  (__  \ "foo27").read[Boolean] ::
  (__  \ "foo28").read[Boolean] ::
  (__  \ "foo29").read[Boolean] ::
  (__  \ "foo30").read[Float] ::
  (__  \ "foo31").read[Float] ::
  (__  \ "foo32").read[Float] ::
  (__  \ "foo33").read[Float] ::
  (__  \ "foo34").read[Float] ::
  (__  \ "foo35").read[Float] ::
  (__  \ "foo36").read[Float] ::
  (__  \ "foo37").read[Float] ::
  (__  \ "foo38").read[Float] ::
  (__  \ "foo39").read[Float] ::
  (__  \ "foo40").read[List[Long]] ::
  (__  \ "foo41").read[List[Long]] ::
  (__  \ "foo42").read[List[Long]] ::
  (__  \ "foo43").read[List[Long]] ::
  (__  \ "foo44").read[List[Long]] ::
  (__  \ "foo45").read[List[Long]] ::
  (__  \ "foo46").read[List[Long]] ::
  (__  \ "foo47").read[List[Long]] ::
  (__  \ "foo48").read[List[Long]] ::
  (__  \ "foo49").read[List[Long]] ::
  (__  \ "foo50").read[JsNull.type] ::
  HNil

// inferred as Rule[JsValue, String] :: Rule[JsValue, String] :: ... :: Rule[JsValue, List[Long]] :: HNil

That's cool, but we want the :: operator to have the same applicative builder behavior as the ~/and operator:

Rule[JsValue, String] :: Rule[JsValue, Long] :: Rule[JsValue, Float] :: HNil =>
  Rule[JsValue, String :: Long :: Float :: HNil]

This looks like a higher-kind fold so let’s call that HFOLD.

We can build this hfold using Shapeless polymorphic functions & a RightFolder.

In a future article, I may write about coding such a Shapeless feature. Meanwhile, you'll have to discover the code on Github; it's a bit hairy but very interesting ;)
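The core of hfold is a single fold step combining a head Rule with the already-folded tail, both reading from the same input and accumulating errors. Here is that step with a simplified Rule encoding (a hand-written sketch with hypothetical names; the real code generalizes it with a Shapeless polymorphic function applied by a RightFolder):

```scala
// simplified stand-in for Rule (hypothetical)
type SimpleRule[I, O] = I => Either[List[String], O]

// one fold step: Rule[I, H] and Rule[I, T] => Rule[I, (H, T)],
// accumulating errors from both sides (applicative behaviour)
def step[I, H, T](head: SimpleRule[I, H], tail: SimpleRule[I, T]): SimpleRule[I, (H, T)] =
  i => (head(i), tail(i)) match {
    case (Right(h), Right(t)) => Right((h, t))
    case (eh, et) =>
      Left(eh.left.toOption.getOrElse(Nil) ++ et.left.toOption.getOrElse(Nil))
  }

// a tiny field reader over a Map, standing in for (__ \ "fooN").read[...]
def field(key: String): SimpleRule[Map[String, String], String] =
  m => m.get(key).toRight(List(s"$key: error.required"))
```

Folding `step` from the right over an HList of Rules is exactly what turns `Rule[I, A] :: Rule[I, B] :: … :: HNil` into `Rule[I, A :: B :: … :: HNil]`.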

Gathering everything, we obtain the following:

/* Rules Folding */
val ruleFold = From[JsValue]{ __ =>
  hfold[JsValue](
    (__  \ "foo1").read[String] ::
    (__  \ "foo2").read[String] ::
    (__  \ "foo3").read[String] ::
    (__  \ "foo4").read[String] ::
    (__  \ "foo5").read[String] ::
    (__  \ "foo6").read[String] ::
    (__  \ "foo7").read[String] ::
    (__  \ "foo8").read[String] ::
    (__  \ "foo9").read[String] ::
    (__  \ "foo10").read[Int] ::
    (__  \ "foo11").read[Int] ::
    (__  \ "foo12").read[Int] ::
    (__  \ "foo13").read[Int] ::
    (__  \ "foo14").read[Int] ::
    (__  \ "foo15").read[Int] ::
    (__  \ "foo16").read[Int] ::
    (__  \ "foo17").read[Int] ::
    (__  \ "foo18").read[Int] ::
    (__  \ "foo19").read[Int] ::
    (__  \ "foo20").read[Boolean] ::
    (__  \ "foo21").read[Boolean] ::
    (__  \ "foo22").read[Boolean] ::
    (__  \ "foo23").read[Boolean] ::
    (__  \ "foo25").read[Boolean] ::
    (__  \ "foo26").read[Boolean] ::
    (__  \ "foo27").read[Boolean] ::
    (__  \ "foo28").read[Boolean] ::
    (__  \ "foo29").read[Boolean] ::
    (__  \ "foo30").read[Float] ::
    (__  \ "foo31").read[Float] ::
    (__  \ "foo32").read[Float] ::
    (__  \ "foo33").read[Float] ::
    (__  \ "foo34").read[Float] ::
    (__  \ "foo35").read[Float] ::
    (__  \ "foo36").read[Float] ::
    (__  \ "foo37").read[Float] ::
    (__  \ "foo38").read[Float] ::
    (__  \ "foo39").read[Float] ::
    (__  \ "foo40").read[List[Long]] ::
    (__  \ "foo41").read[List[Long]] ::
    (__  \ "foo42").read[List[Long]] ::
    (__  \ "foo43").read[List[Long]] ::
    (__  \ "foo44").read[List[Long]] ::
    (__  \ "foo45").read[List[Long]] ::
    (__  \ "foo46").read[List[Long]] ::
    (__  \ "foo47").read[List[Long]] ::
    (__  \ "foo48").read[List[Long]] ::
    (__  \ "foo49").read[List[Long]] ::
    (__  \ "foo50").read[JsNull.type] ::
    HNil
  )
}

Let's write a Play action using this rule:

def hfolder = Action(parse.json) { request =>
  ruleFold.validate(request.body) map { hl =>
    Ok(hl.toString)
  } recoverTotal { errors =>
    BadRequest(errors.toString)
  }
}

// OK
{
  "foo1" : "toto1",
  "foo2" : "toto2",
  "foo3" : "toto3",
  "foo4" : "toto4",
  "foo5" : "toto5",
  "foo6" : "toto6",
  "foo7" : "toto7",
  "foo8" : "toto8",
  "foo9" : "toto9",
  "foo10" : 10,
  "foo11" : 11,
  "foo12" : 12,
  "foo13" : 13,
  "foo14" : 14,
  "foo15" : 15,
  "foo16" : 16,
  "foo17" : 17,
  "foo18" : 18,
  "foo19" : 19,
  "foo20" : true,
  "foo21" : false,
  "foo22" : true,
  "foo23" : false,
  "foo24" : true,
  "foo25" : false,
  "foo26" : true,
  "foo27" : false,
  "foo28" : true,
  "foo29" : false,
  "foo30" : 3.0,
  "foo31" : 3.1,
  "foo32" : 3.2,
  "foo33" : 3.3,
  "foo34" : 3.4,
  "foo35" : 3.5,
  "foo36" : 3.6,
  "foo37" : 3.7,
  "foo38" : 3.8,
  "foo39" : 3.9,
  "foo40" : [1,2,3],
  "foo41" : [11,21,31],
  "foo42" : [12,22,32],
  "foo43" : [13,23,33],
  "foo44" : [14,24,34],
  "foo45" : [15,25,35],
  "foo46" : [16,26,36],
  "foo47" : [17,27,37],
  "foo48" : [18,28,38],
  "foo49" : [19,29,39],
  "foo50" : null
} => toto1 :: toto2 :: toto3 :: toto4 :: toto5 :: toto6 :: toto7 :: toto8 :: toto9 ::
  10 :: 11 :: 12 :: 13 :: 14 :: 15 :: 16 :: 17 :: 18 :: 19 ::
  true :: false :: true :: false :: false :: true :: false :: true :: false ::
  3.0 :: 3.1 :: 3.2 :: 3.3 :: 3.4 :: 3.5 :: 3.6 :: 3.7 :: 3.8 :: 3.9 ::
  List(1, 2, 3) :: List(11, 21, 31) :: List(12, 22, 32) :: List(13, 23, 33) ::
  List(14, 24, 34) :: List(15, 25, 35) :: List(16, 26, 36) :: List(17, 27, 37) ::
  List(18, 28, 38) :: List(19, 29, 39) :: null ::
  HNil


// KO
{
  "foo1" : "toto1",
  "foo2" : "toto2",
  "foo3" : "toto3",
  "foo4" : "toto4",
  "foo5" : "toto5",
  "foo6" : "toto6",
  "foo7" : 50,
  "foo8" : "toto8",
  "foo9" : "toto9",
  "foo10" : 10,
  "foo11" : 11,
  "foo12" : 12,
  "foo13" : 13,
  "foo14" : 14,
  "foo15" : true,
  "foo16" : 16,
  "foo17" : 17,
  "foo18" : 18,
  "foo19" : 19,
  "foo20" : true,
  "foo21" : false,
  "foo22" : true,
  "foo23" : false,
  "foo24" : true,
  "foo25" : false,
  "foo26" : true,
  "foo27" : "chboing",
  "foo28" : true,
  "foo29" : false,
  "foo30" : 3.0,
  "foo31" : 3.1,
  "foo32" : 3.2,
  "foo33" : 3.3,
  "foo34" : 3.4,
  "foo35" : 3.5,
  "foo36" : 3.6,
  "foo37" : 3.7,
  "foo38" : 3.8,
  "foo39" : 3.9,
  "foo40" : [1,2,3],
  "foo41" : [11,21,31],
  "foo42" : [12,22,32],
  "foo43" : [13,23,33],
  "foo44" : [14,24,34],
  "foo45" : [15,25,35],
  "foo46" : [16,26,"blabla"],
  "foo47" : [17,27,37],
  "foo48" : [18,28,38],
  "foo49" : [19,29,39],
  "foo50" : "toto"
} => Failure(List(
  (/foo50,List(ValidationError(error.invalid,WrappedArray(null)))),
  (/foo46[2],List(ValidationError(error.number,WrappedArray(Long)))),
  (/foo27,List(ValidationError(error.invalid,WrappedArray(Boolean)))),
  (/foo15,List(ValidationError(error.number,WrappedArray(Int)))),
  (/foo7,List(ValidationError(error.invalid,WrappedArray(String))
))))

Awesome… now nobody can say the 22 limit is still a problem ;)

Have a look at the code on Github.

Have fun x 50!





The code & sample apps can be found on Github here


Actor-Room makes it easy to:
  • create any group of connected entities (people or not): chatroom, forum, broadcast pivot…
  • manage connections, disconnections, broadcasts and targeted messages through actors and nothing else.
For now, members can be:
  • websocket endpoints through actors, without having to deal with Iteratees/Enumerators…
  • bots to simulate members

Reminders on websockets in Play

Here is the function Play provides to create a websocket:

def async[A](
  f: RequestHeader => Future[(Iteratee[A, _], Enumerator[A])]
)(implicit frameFormatter: FrameFormatter[A]): WebSocket[A]

A websocket is a persistent bi-directional channel of communication (in/out) and is created with:

  • an Iteratee[A, _] to manage all frames received by the websocket endpoint
  • an Enumerator[A] to send messages through the websocket
  • an implicit FrameFormatter[A] to parse frame content to type A (Play provides default FrameFormatter for String or JsValue)

Here is how you traditionally create a websocket endpoint in Play:

object MyController extends Controller {
  def connect = WebSocket.async[JsValue] { rh =>
    // the iteratee to manage received messages
    val iteratee = Iteratee.foreach[JsValue]( js => ...)

    // the enumerator to be able to send messages
    val enumerator = ... // generally a PushEnumerator

    Future.successful((iteratee, enumerator))
  }
}

Generally, the Enumerator[A] is created using Concurrent.broadcast[A] or Concurrent.unicast[A], which are very powerful tools but not so easy to master (the edge cases of connection close and errors are always tricky).
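The broadcast semantics can be pictured with a toy synchronous hub (a hypothetical simplification: the real Concurrent.broadcast[A] returns an (Enumerator[A], Channel[A]) pair and is fully non-blocking):

```scala
import scala.collection.mutable.ListBuffer

// toy hub: one push side, many subscribers, every subscriber sees every message
class Hub[A] {
  private var subscribers = List.empty[A => Unit]
  def subscribe(f: A => Unit): Unit = subscribers = f :: subscribers
  def push(a: A): Unit = subscribers.foreach(_(a))
}

val hub = new Hub[String]
val client1, client2 = ListBuffer.empty[String]
hub.subscribe(client1 += _)
hub.subscribe(client2 += _)
hub.push("hello")   // both clients receive "hello"
```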

You often want to:

  • manage multiple client connections at the same time
  • parse messages received from websockets,
  • do something with the message payload
  • send messages to a given client
  • broadcast messages to all connected members
  • create bots to be able to simulate fake connected members
  • etc…

To do that within Play's non-blocking/async architecture, you often end up developing an actor topology managing all events/messages on top of the previous Iteratee/Enumerator.

The Iteratee/Enumerator part is quite generic but not always so easy to write.

The actor topology is quite generic because there are administration messages that are almost always the same:

  • Connection/Forbidden/Disconnection
  • Broadcast/Send

Actor Room is a helper managing all of this for you, so you can focus on message management using actors and nothing else. It provides all default behaviors, and every behavior can be overridden if needed. It exposes only actors and nothing else.


The code is based on the chatroom sample (and a cool sample by Julien Tournay) from Play Framework pushed far further and in a more generic way.



What is Actor Room?

An actor room manages a group of connected members which are supervised by a supervisor.

Member = 2 actors (receiver/sender)

Each member is represented by 2 actors (1 receiver & 1 sender):

  • You MUST create at least a Receiver Actor because it’s your job to manage your own message format

  • The Sender Actor has a default implementation but you can override it.


Supervisor = 1 actor

All actors are managed by 1 supervisor which has two roles:

  • Creates/supervises all receiver/sender actors

  • Manages administration messages (routing, forwarding, broadcasting etc…)



Code sample step by step

Create the Actor Room

// default constructor
val room = Room()

// constructor with a custom supervisor
// custom supervisors are described later
val room = Room(Props(classOf[CustomSupervisor]))

The room creates the Supervisor actor for you and delegates the creation of receiver/sender actors to it.

If you want to broadcast a message or target a precise member, you should use the supervisor.

room.supervisor ! Broadcast("fromId", Json.obj("foo" -> "bar"))
room.supervisor ! Send("fromId", "toId", Json.obj("foo" -> "bar"))

You can manage several rooms in the same project.


Create the mandatory Receiver Actor

There is only one message to manage:

/** Message received and parsed to type A
  * @param from the ID of the sender
  * @param payload the content of the message
  */
case class Received[A](from: String, payload: A) extends Message

If your websocket frames contain Json, then it should be Received[JsValue].

You just have to create a simple actor:

// Create an actor to receive messages from websocket
class Receiver extends Actor {
  def receive = {
    // Received(fromId, js) is the only Message to manage in receiver
    case Received(from, js: JsValue) =>
      (js \ "msg").asOpt[String] match {
        case None =>
play.Logger.error("couldn't find msg in websocket event")

        case Some(s) =>
          play.Logger.info(s"received $s")
          // broadcast message to all connected members
          context.parent ! Broadcast(from, Json.obj("msg" -> s))
      }
  }
}

Please note the Receiver Actor is supervised by the Supervisor actor. So, within the Receiver Actor, context.parent is the Supervisor and you can use it to send/broadcast messages as follows:

context.parent ! Send(fromId, toId, mymessage)
context.parent ! Broadcast(fromId, mymessage)

// The 2 messages
/** Sends a message from a member to another member */
case class   Send[A](from: String, to: String, payload: A) extends Message

/** Broadcasts a message from a member */
case class   Broadcast[A](from: String, payload: A) extends Message

Create your Json websocket endpoint

Please note that each member is identified by a string that you define yourself.

import org.mandubian.actorroom._

class Receiver extends Actor {
  def receive = {
    ...
  }
}

object Application extends Controller {
  val room = Room()

  /** websocket requires :
    * - the type of the Receiver actor
    * - the type of the payload
    */
  def connect(id: String) = room.websocket[Receiver, JsValue](id)

  // or
  def connect(id: String) = room.websocket[JsValue](id, Props[Receiver])

}

All together

import akka.actor._

import play.api._
import play.api.mvc._
import play.api.libs.json._

// Implicits
import play.api.Play.current
import play.api.libs.concurrent.Execution.Implicits._

import org.mandubian.actorroom._

class Receiver extends Actor {
  def receive = {
    case Received(from, js: JsValue) =>
      (js \ "msg").asOpt[String] match {
        case None => play.Logger.error("couldn't find msg in websocket event")
        case Some(s) =>
          play.Logger.info(s"received $s")
          context.parent ! Broadcast(from, Json.obj("msg" -> s))
      }
  }
}

object Application extends Controller {

  val room = Room()

  def websocket(id: String) = room.websocket[Receiver, JsValue](id)

}


Extend default behaviors

Override the administration message format

AdminMsgFormatter typeclass is used by ActorRoom to format administration messages (Connected, Disconnected and Error) by default.

AdminMsgFormatter[JsValue] and AdminMsgFormatter[String] are provided by default.

You can override the format as following:

// put this implicit in the same scope where you create your websocket endpoint
implicit val msgFormatter = new AdminMsgFormatter[JsValue]{
    def connected(id: String) = Json.obj("kind" -> "connected", "id" -> id)
    def disconnected(id: String) = Json.obj("kind" -> "disconnected", "id" -> id)
    def error(id: String, msg: String) = Json.obj("kind" -> "error", "id" -> id, "msg" -> msg)
}

// then this msgFormatter will be used for all administration messages  
def websocket(id: String) = room.websocket[Receiver, JsValue](id)

Override the Sender Actor

You just have to create a new actor as following:

class MyCustomSender extends Actor {

  def receive = {
    case s: Send[JsValue]        => // message send from a member to another one

    case b: Broadcast[JsValue]   => // message broadcast by a member

    case Connected(id)           => // member "id" has connected

    case Disconnected(id)        => // member "id" has disconnected

    case Init(id, receiverActor) => // Message sent when sender actor is initialized by ActorRoom

  }

}

Then you must initialize your websocket with it:

def connect(id: String) = room.websocket[JsValue](_ => id, Props[Receiver], Props[MyCustomSender])

You can override the following messages:

// public sender messages
/** Sender actor is initialized by Supervisor */
case class   Init(id: String, receiverActor: ActorRef)

/** Sends a message from a member to another member */
case class   Send[A](from: String, to: String, payload: A) extends Message

/** Broadcasts a message from a member */
case class   Broadcast[A](from: String, payload: A) extends Message

/** member with ID has connected */
case class   Connected(id: String) extends Message

/** member with ID has disconnected */
case class   Disconnected(id: String) extends Message

Override the Supervisor Actor

Please note the Supervisor is an actor which manages an internal state containing all members:

var members = Map.empty[String, Member]

You can override the default Supervisor as following:

class CustomSupervisor extends Supervisor {

  def customBroadcast: Receive = {
    case Broadcast(from, js: JsObject) =>
      // adds members to all messages
      val ids = Json.obj("members" -> members.map(_._1))

      members.foreach {
        case (id, member) =>
          member.sender ! Broadcast(from, js ++ ids)
      }
  }

  override def receive = customBroadcast orElse super.receive
}

Create a bot to simulate member

A bot is a fake member that you can use to communicate with other members. It's identified by an ID like any other member.

You create a bot with one of these APIs:

case class Member(id: String, receiver: ActorRef, sender: ActorRef) extends Message

def bot[Payload](id: String)
    (implicit msgFormatter: AdminMsgFormatter[Payload]): Future[Member]

def bot[Payload](
    id: String,
    senderProps: Props
  )(implicit msgFormatter: AdminMsgFormatter[Payload]): Future[Member]


def bot[Payload](
    id: String,
    receiverProps: Props,
    senderProps: Props): Future[Member]

Then, with the returned Member, you can simulate messages:

val room = Room()

// room.bot returns a Future[Member]
room.bot[JsValue]("robot") map { bot =>
  // simulate a received message
  bot.receiver ! Received(bot.id, Json.obj("foo" -> "bar"))
}

Naturally, you can override the Bot Sender Actor:

/** The default actor sender for Bots */
class BotSender extends Actor {

  def receive = {
    case s =>
      play.Logger.info(s"Bot should have sent ${s}")

  }

}

val bot = room.bot[JsValue]("robot", Props[BotSender])

So what else??? Everything you can override and everything that I didn’t implement yet…

On the Github project, you will find 2 samples:

  • simplest which is a very simple working sample.
  • websocket-chat which is just the Play Framework ChatRoom sample rewritten with ActorRoom.

Have fun!





The code for all autosources & sample apps can be found on Github here

The aim of this article is to show how scalaz-stream could be plugged into existing Play Iteratees/Enumerators and used in your web projects. I also wanted to evaluate in depth the power of scalaz-stream Processes by trying to write a recursive streaming action: I mean a web endpoint streaming data and re-injecting its own streamed data into itself.


If you want to see now how scalaz-stream is used with Play, go to this paragraph directly.


Why Scalaz-Stream when you have Play Iteratees?


Play Iteratees are powerful & cool but…

I’m a fan of everything dealing with data streaming and realtime management in backends. I’ve worked a lot on Play Framework and naturally I’ve been using the cornerstone behind Play’s reactive nature: Play Iteratees.

Iteratees (with its counterparts, Enumerators and Enumeratees) are great to manipulate/transform linear streams of data chunks in a very reactive (non-blocking & asynchronous) and purely functional way:

  • Enumerator identifies the data producer that can generate finite/infinite/procedural data streams.
  • Iteratee is simply a data folder built as a state machine based on 3 states (Continue, Done, Error) which consumes data from an Enumerator to compute a final result.
  • Enumeratee is a kind of transducer able to adapt an Enumerator producing one type of data to an Iteratee that expects another type of data. An Enumeratee can be used as both a pipe transformer and an adapter.
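The three concepts above can be pictured with a rough synchronous analogue over plain collections (just an intuition aid; the real types are non-blocking and push-based):

```scala
// Enumerator ~ produces chunks; Enumeratee ~ adapts chunk types; Iteratee ~ folds
val enumerator: Seq[String] = Seq("1", "2", "3")   // data producer
val enumeratee: String => Int = _.toInt            // String ~> Int adapter
val iterateeZero = 0
val iteratee: (Int, Int) => Int = _ + _            // consumer folding to a result

val result = enumerator.map(enumeratee).foldLeft(iterateeZero)(iteratee)
// result == 6
```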

Iteratees are really powerful but I must say I've always found them quite tricky to use, practically speaking. In Play, they are used in their best use-case, as they were created exactly for that. I've been using Iteratees for more than a year now but I still don't feel fluent with them. Each time I use them, I must spend some time working out how to write what I need. It's not because they are purely functional (piping an Enumerator into an Enumeratee into an Iteratee is quite trivial), but there is something my brain doesn't want to catch.

If you want more details about my experience with Iteratees, go to this paragraph

That's why I wanted to try other functional streaming tools, to see whether they suffer from the same kind of usability toughness or feel more natural. There are lots of other competitors in the field, such as pipes, conduits and machines. As I don't have the time to study all of them in depth, I've chosen the one that appealed to me the most, i.e. Machines.

I'm not yet a Haskell coder, even if I can mumble it, so I preferred to evaluate the concept with scalaz-stream, a Scala implementation trying to bring machines to normal coders, focused on the IO streaming aspect.



Scratching the concepts of Machine / Process ?

I'm not going to judge whether Machines are better than Iteratees; that's not my aim. I'm just experimenting with the concept in an objective way.

I won't explain the concept of Machines in depth because it's huge and I don't think I have the theoretical background to do it justice right now. So let's focus on very basic ideas at first:

  • Machine is a very generic concept that represents a data processing mechanism with potential multiple inputs, an output and monadic effects (typically Future input chunks, side-effects while transforming, delayed output…)
  • To simplify, let's say a machine is a bit like a Meccano set that you construct by plugging together other more generic machines (such as source, transducer, sink, tee, wye) as simply as pipes.
  • Building a machine also means planning all the steps you will go through when managing streamed data but it doesn’t do anything until you run it (no side-effect, no resource consumption). You can re-run a machine as many times as you want.
  • A machine is a state machine (Emit/Await/Halt) like an Iteratee, but it manages errors in a more explicit way IMHO (fallback/error)

In scalaz-stream, you don't manipulate machines directly, as they are too abstract for real-life use-cases; you manipulate simpler concepts:

  • Process[M, O] is a restricted machine outputting a stream of O. It can be a source if the monadic effect gets input from I/O or generates procedural data, or a sink if you don’t care about the output. Please note that it doesn’t infer the type of potential input at all.
  • Wye[L, R, O] is a machine that takes 2 inputs (left L / right R) and outputs chunks of type O (you can read from left or right, or wait for both before outputting)
  • Tee[L, R, O] is a Wye that can only read alternatively from left or from right but not from both at the same time.
  • Process1[I, O] can be seen as a transducer which accepts inputs of type I and outputs chunks of type O (a bit like Enumeratee)
  • Channel[M, I, O] is an effectful channel that accepts input of type I and uses it in a monadic effect M to produce a potential O
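To give a feel for the Emit/Await/Halt state machine behind Process1, here is a tiny effect-free transducer and interpreter (a hand-rolled sketch, not the scalaz-stream encoding, which also handles fallback/cleanup and monadic effects):

```scala
// the three states of a simplified transducer (hypothetical encoding)
sealed trait P1[-I, +O]
case class Emit[I, O](head: O, tail: P1[I, O]) extends P1[I, O]
case class Await[I, O](receive: I => P1[I, O]) extends P1[I, O]
case object Halt extends P1[Any, Nothing]

// drive a transducer over a list of inputs, collecting its outputs
def run[I, O](p: P1[I, O], in: List[I], acc: List[O] = Nil): List[O] = p match {
  case Emit(o, next) => run(next, in, o :: acc)
  case Await(f) => in match {
    case h :: t => run(f(h), t, acc)
    case Nil    => acc.reverse   // input exhausted
  }
  case Halt => acc.reverse
}

// a transducer doubling each input, in the spirit of process1.lift
def doubler: P1[Int, Int] = Await(i => Emit(i * 2, doubler))
```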

What do I find attractive in Machines?

  • A Machine is producer/consumer/transducer all in one place: Machines can consume/fold like an Iteratee, transform like an Enumeratee and emit like an Enumerator at the same time, which opens lots of possibilities (even if 3 concepts in one could make it more complicated too).
  • I feel like I'm playing with Lego as you plug machines onto other machines; this is quite fun actually.
  • Machines manage monadic effects in their design without fixing the type of effect, so you can use them with I/O, Future and whatever monadic effect you can imagine…
  • Machines provide out-of-the-box Tee/Wye to compose, interleave or zip streams as you want without writing crazy code.
  • The early code samples I’ve seen were quite easy to read (even the implementation is not so complex). Have a look at the StartHere sample provided by scalaz-stream:
property("simple file I/O") = secure {

    val converter: Task[Unit] =
      io.linesR("testdata/fahrenheit.txt")
        .filter(s => !s.trim.isEmpty && !s.startsWith("//"))
        .map(line => fahrenheitToCelsius(line.toDouble).toString)
        .intersperse("\n")
        .pipe(process1.utf8Encode)
        .to(io.fileChunkW("testdata/celsius.txt"))
        .run

    converter.run
    true
  }

But don’t think everything is so simple: Machines is a complex concept with lots of quite abstract theory behind it. What I find very interesting is that this very abstract concept can be explained with simpler concepts such as Process, Source, Sink, Tee, Wye… which you can grasp quite easily, as these are concepts you already manipulated when you were playing in your bathtub as a child (or even now).



Scalaz-stream Plug’n’Play Iteratee/Enumerator

After these considerations, I wanted to experiment with scalaz-stream and Play’s streaming capabilities in order to see how it behaves in a context I know.

Here is what I decided to study:

  • Stream data out of a controller action using a scalaz-stream Process
  • Call an AsyncWebService & consume the response as a stream of Array[Byte] using a scalaz-stream Process

Here is the existing Play API:

  • Action provides Ok.stream(Enumerator)
  • WS provides a call consuming the response as a stream of data: WS.get(r: ResponseHeader => Iteratee)

As you can see, these APIs depend on Iteratee/Enumerator. As I didn’t want to hack Play too much to begin with, I decided to try & plug scalaz-stream onto Play Iteratees (if possible).

Building Enumerator[O] from Process[Task, O]

The idea is to take a scalaz-stream Source[O] (Process[M,O]) and wrap it into an Enumerator[O] so that it can be used in Play controller actions.

An Enumerator is a data producer which can generate its data using monadic Future effects (the Play Iteratee is tightly linked to Future).

Process[Task, O] is a machine outputting a stream of O, so it’s logically the right candidate to be adapted into an Enumerator[O]. Remember that Task is just a scalaz Future[Either[Throwable, A]] with a few helpers, and it’s the effect used throughout scalaz-stream.
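As a reminder of the shapes involved, here is a simplified sketch (assumed shapes, not the real definitions, which live in the Play and scalaz sources):

```scala
import scala.concurrent.Future

// simplified sketch of an Iteratee (a stream consumer folding chunks of E into an A)
trait Iteratee[E, A]

// simplified sketch of a Play Enumerator:
// it feeds an Iteratee with chunks of E and eventually returns its new state
trait Enumerator[E] {
  def apply[A](i: Iteratee[E, A]): Future[Iteratee[E, A]]
}

// scalaz Task essentially wraps an async Either[Throwable, A]
// (scalaz actually uses its own Future and \/ instead of scala.concurrent ones)
final class Task[+A](val get: Future[Either[Throwable, A]])
```

The wrapper described below only has to bridge these two shapes: step the Process and push each emitted O into the Iteratee.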

So I’ve implemented (at least tried) an Enumerator[O] that accepts a Process[Task, O]:

def enumerator[O](p: Process[Task, O])(implicit ctx: ExecutionContext) =
    new Enumerator[O] {
      ...
      // look the code in github project
      ...
  }

The implementation just synchronizes the states of the Iteratee[O, A] consuming the Enumerator with the states of Process[Task, O] emitting data chunks of O. It’s quite simple actually.



Building Process1[I, O] from Iteratee[I, O]

The idea is to drive an Iteratee from a scalaz-stream Process so that it can consume an Enumerator and be used in Play WS.

An Iteratee[I, O] accepts inputs of type I (and nothing else) and will fold the input stream into a single result of type O.

A Process1[I, O] accepts inputs of type I and emits chunks of type O, but not necessarily a single output chunk. So it’s a good candidate for our use-case, but we need to choose which emitted chunk will be the result of the Iteratee[I, O]. Here, totally arbitrarily, I’ve chosen to take the first emitted chunk as the result (but the last would be as good, if not better).

So I implemented the following:

def iterateeFirstEmit[I, O](p: Process.Process1[I, O])(implicit ctx: ExecutionContext): Iteratee[I, O] = {
  ...
  // look the code in github project
  ...
}

The implementation is really raw, just for experimentation: it goes through the states of the Process1[I, O] and generates the corresponding states of the Iteratee[I, O] until the first emitted value. Nothing more, nothing less…



A few basic action samples

Everything done in those samples could be done with Iteratee/Enumeratee, more or less simply. That’s not the point!


Sample 1 : Generates a stream from a Simple Emitter Process

def sample1 = Action {
  val process = Process.emitAll(Seq(1, 2, 3, 4)).map(_.toString)

  Ok.stream(enumerator(process))
}
> curl "localhost:10000/sample1" --no-buffer
1234

Sample 2 : Generates a stream from a continuous emitter

/** A process generating an infinite stream of natural numbers */
val numerals = Process.unfold(0){ s => val x = s+1; Some(x, x) }.repeat

// we limit the number of outputs, but you don't have to: it can stream forever...
def sample2 = Action {
  Ok.stream(enumerator(numerals.map(_.toString).intersperse(",").take(40)))
}
> curl "localhost:10000/sample2" --no-buffer
1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,

Sample 3 : Generates a stream whose output frequency is controlled by a tee with numeral generator on left and ticker on right

/** ticks constant every delay milliseconds */
def ticker(constant: Int, delay: Long): Process[Task, Int] = Process.await(
  scalaFuture2scalazTask(delayedNumber(constant, delay))
)(Process.emit).repeat

def sample3 = Action {
  Ok.stream(enumerator(
    // creates a Tee outputting only numerals but consuming the ticker in parallel to get the delayed effect
    (numerals tee ticker(0, 100))(processes.zipWith((a,b) => a))
      .take(100)
      .map(_.toString)
      .intersperse(",")
  ))
}

Please note:

  • scalaFuture2scalazTask is just a helper to convert a Future into a Task
  • ticker is quite simple to understand: it awaits a Task[Int], emits this Int and repeats again…
  • processes.zipWith((a,b) => a) is a tee (2 inputs left/right) that outputs only left data but also consumes the right to get the delay effect.
  • .map(_.toString) simply converts into something writeable by Ok.stream
  • .intersperse(",") simply adds "," between elements
> curl "localhost:10000/sample3" --no-buffer
1... // to simulate the progressive apparition of numbers on screen
1,...
1,2...
...
1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100

Sample 4 : Generates a stream using side-effect to control output frequency

/** Async generates this Int after delay*/
def delayedNumber(i: Int, delay: Long): Future[Int] =
  play.api.libs.concurrent.Promise.timeout(i, delay)

/** Creates a process generating an infinite stream of natural numbers
  * every `delay` milliseconds
  */
def delayedNumerals(delay: Long) = {
  def step(i: Int): Process[Task, Int] = {
    Process.emit(i).then(
      Process.await(scalaFuture2scalazTask(delayedNumber(i+1, delay)))(step)
    )
  }
  Process.await(scalaFuture2scalazTask(delayedNumber(0, delay)))(step)
}

def sample4 = Action {
  Ok.stream(enumerator(delayedNumerals(100).take(100).map(_.toString).intersperse(",")))
}

Please note:

  • delayedNumber uses Play’s Promise.timeout (backed by an Akka scheduler) to trigger our value after the delay
  • delayedNumerals shows a simple recursive Process[Task, Int] construction which shouldn’t be too hard to understand
> curl "localhost:10000/sample4" --no-buffer
0... // to simulate the progressive apparition of numbers every 100ms
0,...
0,1...
0,1,...
0,1,2...
...
0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99

Sample 5 : Generates a stream by consuming completely another stream

// a process folding all Array[Byte] into a big String
val reader: Process.Process1[Array[Byte], String] = processes.fold1[Array[Byte]]((a, b) => a ++ b )
  .map{ arr => new String(arr) } |> processes.last

def sample5 = Action {
  // the WS call with response consumer by previous Process1[Array[Byte], String] driving the Iteratee[Array[Byte], String]
  val maybeValues: Future[String] =
    WS.url(routes.Application.sample2().absoluteURL())
      .get(rh => iterateeFirstEmit(reader))
      .flatMap(_.run)

  Ok.stream(enumerator(
    // wraps the received String in a Process
    // re-splits it to remove ","
    // emits all chunks
    Process.wrap(scalaFuture2scalazTask(maybeValues))
      .flatMap{ values => Process.emitAll(values.split(",")) }
  ))
}

Please note:

  • reader is a Process1[Array[Byte], String] that folds all received Array[Byte] into a String
  • iterateeFirstEmit(reader) simulates an Iteratee[Array[Byte], String] driven by the reader process that will fold all chunks of data received from the WS call to routes.Application.sample2()
  • .get(rh => iterateeFirstEmit(reader)) will return a Future[Iteratee[Array[Byte], String]] that is run in .flatMap(_.run) to return a Future[String]
  • Process.wrap(scalaFuture2scalazTask(maybeValues)) is a trick to wrap the folded Future[String] into a Process[Task, String]
  • Process.emitAll(values.split(",")) splits the resulting string again and emits all chunks outside (stupid, just for demo)
> curl "localhost:10000/sample5" --no-buffer
1234567891011121314151617181920

Still there? Let’s dive deeper and be sharper!


Building recursive streaming action consuming itself


Hacking WS to consume & re-emit WS in realtime

WS.executeStream(r: ResponseHeader => Iteratee[Array[Byte], A]) is a cool API because you can build an Iteratee from the ResponseHeader, and the Iteratee then consumes the received Array[Byte] chunks in a reactive way and folds them. The problem is that until the Iteratee has finished, you won’t have any result.

But I’d like to be able to receive chunks of data in realtime and re-emit them immediately so that I can inject them into realtime data-flow processing. The WS API doesn’t allow this, so I decided to hack it a bit. I’ve written WSZ which provides this API:

def getRealTime(): Process[Future, Array[Byte]]
// based on
private[libs] def realtimeStream: Process[Future, Array[Byte]]

This API outputs a realtime stream of Array[Byte] whose flow is controlled by promises (Future) being redeemed in the AsyncHttpClient AsyncHandler. I didn’t care about ResponseHeaders for this experimentation, but they should be taken into account in a more serious implementation.

I obtain a Process[Future, Array[Byte]] streaming received chunks in realtime and I can then take advantage of the power of machines to manipulate the data chunks as I want.


Sample 6 : Generates a stream by forwarding/refolding another stream in realtime

/** A Process1 splitting input strings using splitter and re-grouping chunks */
def splitFold(splitter: String): Process.Process1[String, String] = {
  // the recursive splitter / refolder
  def go(rest: String)(str: String): Process.Process1[String, String] = {
    val splitted = str.split(splitter)
    println(s"""$str - ${splitted.mkString(",")} --""")
    (splitted.length match {
      case 0 =>
        // string == splitter
        // emit rest
        // loop
        Process.emit(rest).then( Process.await1[String].flatMap(go("")) )
      case 1 =>
        // splitter not found in string 
        // so waiting for next string
        // loop by adding current str to rest
        // but if we reach end of input, then we emit (rest+str) for last element
        Process.await1[String].flatMap(go(rest + str)).orElse(Process.emit(rest+str))
      case _ =>
        // splitter found
        // emit rest + splitted.head
        // emit all splitted elements but last
        // loops with rest = splitted last element
        Process.emit(rest + splitted.head)
               .then( Process.emitAll(splitted.tail.init) )
               .then( Process.await1[String].flatMap(go(splitted.last)) )
    })
  }
  // await1 simply means "await an input string and emits it"
  Process.await1[String].flatMap(go(""))
}

def sample6 = Action { implicit request =>
  val p = WSZ.url(routes.Application.sample4().absoluteURL()).getRealTime.translate(Task2FutureNT)

  Ok.stream(enumerator(p.map(new String(_)) |> splitFold(",")))
}

Please note:

  • def splitFold(splitter: String): Process.Process1[String, String] is just a demo showing that coding a Process transducer isn’t so crazy… Look at the comments in the code
  • .translate(Task2FutureNT) converts the Process[Future, Array[Byte]] to a Process[Task, Array[Byte]] using a Scalaz natural transformation.
  • p |> splitFold(",") means “pipe output of process p to input of splitFold”.
> curl "localhost:10000/sample6" --no-buffer
0...
01...
012...
...
01234567891011121314151617181920212223242526272829303132333435363738394041424344454647484950515253545556575859606162636465666768697071727374757677787980818283848586878889909192939495969798

Let’s finish our trip with a bit of puzzle and mystery.


THE FINAL MYSTERY: recursive stream generating Fibonacci series

As soon as my first experiments with scalaz-stream and Play were operational, I imagined an interesting case:

Is it possible to build an action generating a stream of data fed by itself: a kind of recursive stream?

With an Iteratee, it’s not really possible since it can’t emit data before finishing its iteration. It would certainly be possible with an Enumeratee, but the API doesn’t exist, and I find it much more obvious with the scalaz-stream API!

The mystery isn’t in the answer to my question: YES it is possible!

The idea is simple:

  • Create a simple action
  • Create a first process emitting a few initialization data
  • Create a second process which consumes the WS calling my own action and re-emits the received chunks in realtime
  • Append first process output and second process output
  • Stream the global output as the result of the action, which will be back-propagated along time to the action itself…

Naturally, if it consumes its own data, it will call itself again and again and again until you reach the connection or open file-descriptor limit. As a consequence, you must limit the depth of recursion.

I performed different experiments to explore this use-case by zipping the stream with itself, adding elements to themselves, etc… And after a few tries, I implemented the following code quite fortuitously:

/** @param curDepth the current recursion depth
  * @param maxDepth the max recursion depth
  */
def sample7(curDepth: Int, maxDepth: Int) = Action { implicit request =>

  // initializes the series with the 2 first numerals output with a delay of 100ms
  val init: Process[Task, String] = delayedNumerals(100).take(2).map(_.toString)

  // Creates the output Process:
  // if we didn't reach maxDepth, creates a process consuming my own action
  // if we reached maxDepth, just emits 0
  val outputProcess =
    if(curDepth < maxDepth) {
      // calling my own action and streaming chunks using getRealTime
      // (splitFold isn't useful here, just for demo)
      val myself = WSZ.url(
        routes.Application.sample7(curDepth+1, maxDepth).absoluteURL()
      ).getRealTime.translate(Task2FutureNT)
        .map(new String(_)) |> splitFold(",")

      // THE IMPORTANT PART BEGIN
      // appends `init` output with `myself` output
      // pipe it through a helper provided scalaz-stream `processes.sum[Long]`
      // which sums elements and emits partial sums
      ((init append myself).map(_.toLong) |> processes.sum[Long])
      // THE IMPORTANT PART END
      // just for output format
      .map(_.toString).intersperse(",")
    }
    else Process.emit(0).map(_.toString)

  Ok.stream(enumerator(outputProcess))
}

Launch it:

curl "localhost:10000/sample7?curDepth=0&maxDepth=10" --no-buffer
0,1,1,2,3,5,8,13,21,34,55,89,144,233,377,610,987,1597,2584,4181,6765

WTF??? This is the Fibonacci series?!

Just to remind you about it:

e(0) = 0
e(1) = 1
e(n) = e(n-1) + e(n-2)
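Just to check the numbers with a classic snippet (nothing to do with the mystery itself), the same series in plain Scala:

```scala
// classic self-referential Fibonacci stream
lazy val fibs: Stream[Long] =
  0L #:: 1L #:: fibs.zip(fibs.tail).map { case (a, b) => a + b }

fibs.take(10).toList
// List(0, 1, 1, 2, 3, 5, 8, 13, 21, 34)
```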

Here is the mystery!!!

How does it work???

I won’t tell the answer to this puzzling side-effect and let you think about it and discover why it works XD

But this sample shows exactly what I wanted: Yes, it’s possible to feed an action with its own feed! Victory!



Conclusion

Ok, all of that was really funky, but is it useful in real projects? I don’t really know yet, but it’s a great proof of the very reactive character of both scalaz-stream and Play!

I tend to like scalaz-stream, and right now I feel more comfortable, more natural using Process than Iteratee… Maybe this is just an impression, so I’ll stay cautious about my conclusions for now…

All of this code is just experimental, so be aware of that. If you like it and think it could be useful, tell me so that we can create a real library from it!

Have Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun, Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun, Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,Fun,!



PostScriptum

A few more details about Iteratees

Here are a few things that bother me when I use Play Iteratee (you don’t have to agree, this is very subjective):

  • Enumeratees are really powerful (maybe the most powerful part of the API) but they can be tricky: for example, defining a new Enumeratee from scratch isn’t easy at first sight due to the signature of Enumeratee itself, and an Enumeratee composes differently on the left (with Enumerators) and on the right (with Iteratees), which can feel strange at the beginning…
  • Enumerators are not defined (when not using helpers) in terms of the data they produce but with respect to the way an Iteratee will consume the data they will produce. You must somewhat reverse your way of thinking, which is not so natural.
  • Iteratees are great for producing one result by folding a stream of data, but if you want to consume/cut/aggregate/re-emit the chunks, the code you write based on Iteratee/Enumeratee quickly becomes complex, hard to re-read, and edge cases (error, end of stream) are hard to handle.
  • When you want to manipulate multiple streams together, zip/interleave them, you must write very complex code too.
  • End-of-iteration and error management with Iteratees isn’t really clear IMHO, and when you begin to compose Iteratees together, it becomes hard to know what will happen…
  • If you want to manipulate a stream with side effects, you can do it with Enumeratees but it’s not so obvious…




Now you should use play-autosource 2.0, which corrects a few issues & introduces ActionBuilder from Play 2.2


The code for all autosources & sample apps can be found on Github here

Brand New Autosources

Play AutoSource now has 2 more implementations:

One month ago, I’ve demo’ed the concept of Autosource for Play2/Scala with ReactiveMongo in this article. ReactiveMongo was the perfect target for this idea because it accepts Json structures almost natively for both documents manipulation and queries.

But how does the concept behave when applied to a DB whose data are constrained by a schema and whose queries aren’t Json?


Using Datomisca-Autosource in your Play project

Add the following lines to your project/Build.scala:

val mandubianRepo = Seq(
  "Mandubian repository snapshots" at "https://github.com/mandubian/mandubian-mvn/raw/master/snapshots/",
  "Mandubian repository releases" at "https://github.com/mandubian/mandubian-mvn/raw/master/releases/"
)

val appDependencies = Seq()

val main = play.Project(appName, appVersion, appDependencies).settings(
  resolvers ++= mandubianRepo,
  libraryDependencies ++= Seq(
    "play-autosource"   %% "datomisca"       % "1.0",
    ...
  )
)

Create your Model + Schema

With the ReactiveMongo Autosource, you could create a pure blob Autosource using JsObject without any supplementary information. But with Datomic, it’s not possible because Datomic forces you to use a schema for your data.

We could create a schema and manipulate JsObject directly with Datomic and some Json validators. But I’m going to focus on the static models because this is the way people traditionally interact with a Schema-constrained DB.

Let’s create our model and schema.

// The Model (with characters pointing on Datomic named entities)
case class Person(name: String, age: Long, characters: Set[DRef])

// The Schema written with Datomisca
object Person {
  // Namespaces
  val person = new Namespace("person") {
    val characters = Namespace("person.characters")
  }

  // Attributes
  val name       = Attribute(person / "name",       SchemaType.string, Cardinality.one) .withDoc("Person's name")
  val age        = Attribute(person / "age",        SchemaType.long,   Cardinality.one) .withDoc("Person's age")
  val characters = Attribute(person / "characters", SchemaType.ref,    Cardinality.many).withDoc("Person's characterS")

  // Characters named entities
  val violent = AddIdent(person.characters / "violent")
  val weak    = AddIdent(person.characters / "weak")
  val clever  = AddIdent(person.characters / "clever")
  val dumb    = AddIdent(person.characters / "dumb")
  val stupid  = AddIdent(person.characters / "stupid")

  // Schema
  val schema = Seq(
    name, age, characters,
    violent, weak, clever, dumb, stupid
  )
}

Create Datomisca Autosource

Now that we have our schema, let’s write the autosource.

import datomisca._
import Datomic._

import play.autosource.datomisca._

import play.modules.datomisca._
import Implicits._

import scala.concurrent.Await
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global
import play.api.Play.current

import models._
import Person._

object Persons extends DatomiscaAutoSourceController[Person] {
  // gets the Datomic URI from application.conf
  val uri = DatomicPlugin.uri("mem")

  // ugly DB initialization ONLY for test purpose
  Datomic.createDatabase(uri)

  // Datomic connection is required
  override implicit val conn = Datomic.connect(uri)
  // Datomic partition in which you store your entities
  override val partition = Partition.USER

  // more than ugly schema provisioning, ONLY for test purpose
  Await.result(
    Datomic.transact(Person.schema),
    Duration("10 seconds")
  )

}

Implementing Json <-> Person <-> Datomic transformers

If you compile the previous code, you should get the following error:

could not find implicit value for parameter datomicReader: datomisca.EntityReader[models.Person]

Actually, Datomisca Autosource requires 4 elements to work:

  • Json.Format[Person] to convert Person instances from/to Json (network interface)
  • EntityReader[Person] to convert Person instances from Datomic entities (Datomic interface)
  • PartialAddEntityWriter[Person] to convert Person instances to Datomic entities (Datomic interface)
  • Reads[PartialAddEntity] to convert Json to a PartialAddEntity, which is actually a simple map of fields/values used to partially update an existing entity (one single field, for example).

It might seem more complicated than with ReactiveMongo but there is nothing really different: the autosource converts Person from/to Json and then converts Person from/to the Datomic structure, i.e. PartialAddEntity. With ReactiveMongo, the only difference is that it understands Json so well that a static model sometimes becomes unnecessary ;)…

Let’s define those elements in Person companion object.

object Person {
...
  // Classic Play2 Json Reads/Writes
  implicit val personFormat = Json.format[Person]

  // Partial entity update : Json to PartialAddEntity Reads
  implicit val partialUpdate: Reads[PartialAddEntity] = (
    ((__ \ 'name).read(readAttr[String](Person.name)) orElse Reads.pure(PartialAddEntity(Map.empty))) and
    ((__ \ 'age) .read(readAttr[Long](Person.age)) orElse Reads.pure(PartialAddEntity(Map.empty))) and
    // need to specify the type because a ref/many can be a list of drefs or entities, so we tell it explicitly
    (__ \ 'characters).read( readAttr[Set[DRef]](Person.characters) )
  ).reduce

  // Entity Reads (looks like Json combinators but it's Datomisca combinators)
  implicit val entity2Person: EntityReader[Person] = (
    name      .read[String]   and
    age       .read[Long]     and
    characters.read[Set[DRef]]
  )(Person.apply _)

  // Entity Writes (looks like Json combinators but it's Datomisca combinators)
  implicit val person2Entity: PartialAddEntityWriter[Person] = (
    name      .write[String]   and
    age       .write[Long]     and
    characters.write[Set[DRef]]
  )(DatomicMapping.unlift(Person.unapply))

...
}

Now we have everything to work except a few configurations.

Add the AutoSource routes at the beginning of conf/routes

->      /persons                    controllers.Persons

Create conf/play.plugins to initialize Datomisca Plugin

400:play.modules.datomisca.DatomicPlugin

Append to conf/application.conf to initialize the Datomic connection

datomisca.uri.mem="datomic:mem://mem"

Insert your first 2 persons with Curl

>curl -X POST -d '{ "name":"bob", "age":25, "characters": ["person.characters/stupid", "person.characters/violent"] }' --header "Content-Type:application/json" http://localhost:9000/persons --include

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 21

{"id":17592186045423} -> oh a Datomic ID

>curl -X POST -d '{ "name":"john", "age":43, "characters": ["person.characters/clever", "person.characters/weak"] }' --header "Content-Type:application/json" http://localhost:9000/persons --include

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 21

{"id":17592186045425}

Querying is the biggest difference in Datomic

In Datomic, you can’t do a getAll without providing a Datomic Query.

But what is a Datomic query? It’s inspired by Datalog which uses predicates to express the constraints on the searched entities. You can combine predicates together.

With Datomisca Autosource, you can directly send datalog queries in the query parameter q for GET or in body for POST with one restriction: your query can’t accept input parameters and must return only the entity ID. For ex:

[ :find ?e :where [ ?e :person/name "john"] ] --> OK

[ :find ?e ?name :where [ ?e :person/name ?name] ] --> KO

Let’s use it by finding all persons.

>curl -X POST --header "Content-Type:text/plain" -d '[:find ?e :where [?e :person/name]]' 'http://localhost:9000/persons/find' --include

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 231

[
    {
        "name": "bob",
        "age": 25,
        "characters": [
            ":person.characters/violent",
            ":person.characters/stupid"
        ],
        "id": 17592186045423
    },
    {
        "name": "john",
        "age": 43,
        "characters": [
            ":person.characters/clever",
            ":person.characters/weak"
        ],
        "id": 17592186045425
    }
]

Please note the use of POST here instead of GET, because Curl doesn’t like [] in URLs, even with the -g option

Now you can use all other routes provided by Autosource

Autosource Standard Routes

Get / Find / Stream

  • GET /persons?… -> Find by query
  • GET /persons/ID -> Find by ID
  • GET /persons/stream -> Find by query & stream result by page

Insert / Batch / Find

  • POST /persons + BODY -> Insert
  • POST /persons/find + BODY -> find by query (when query is too complex to be in a GET)
  • POST /persons/batch + BODY -> batch insert (multiple)

Update / batch

  • PUT /persons/ID + BODY -> Update by ID
  • PUT /persons/ID/partial + BODY -> Update partially by ID
  • PUT /persons/batch -> batch update (multiple)

Delete / Batch

  • DELETE /persons/ID -> delete by ID
  • DELETE /persons/batch + BODY -> batch delete (multiple)


Conclusion

Play-Autosource’s ambition was to be DB agnostic (as much as possible) and showing that the concept can be applied to schemaless DB (ReactiveMongo & CouchDB) and schema DB (Datomic) is a good sign it can work. Naturally, there are a few more elements to provide for Datomic than in ReactiveMongo but it’s useful anyway.

Thanks to @TrevorReznik for his contribution of the CouchBase Autosource.

I hope to see soon one for Slick and a few more ;)

Have Autofun!





EXPERIMENTAL / DRAFT


Do you remember the JsPath pattern matching presented in this article?

Let’s now go further with something that you should enjoy even more: Json Interpolation & Pattern Matching.

I’ve had the idea of these features in my mind for some time, but let’s render unto Caesar what is Caesar’s: Rapture.io proved that it could be done quite easily, and I must say I got inspired by (not to say stole ;)) a few implementation details from them! (especially the @inline implicit conversion for the string interpolation class, which is required due to a value-class limitation that should be removed in future Scala versions)

First of all, code samples as usual…

Create JsValue using String interpolation

scala> val js = json"""{ "foo" : "bar", "foo2" : 123 }"""
js: play.api.libs.json.JsValue = {"foo":"bar","foo2":123}

scala> js == Json.obj("foo" -> "bar", "foo2" -> 123)
res1: Boolean = true

scala> val js = json"""[ 1, true, "foo", 345.234]"""
js: play.api.libs.json.JsValue = [1,true,"foo",345.234]

scala> js == Json.arr(1, true, "foo", 345.234)
res2: Boolean = true

Yes, pure Json in a string…

How does it work? Using string interpolation, introduced in Scala 2.10.0, and Jackson for the parsing…
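A minimal sketch of how such an interpolator can be wired (names are assumed; the real implementation is more subtle, e.g. the @inline value-class trick mentioned earlier, and it also has to splice JsValue arguments properly):

```scala
import play.api.libs.json._

// naive sketch: rebuild the full string with the interpolated arguments,
// then parse it at runtime with Json.parse (which delegates to Jackson)
implicit class JsonContext(val sc: StringContext) {
  def json(args: Any*): JsValue = Json.parse(sc.s(args: _*))
}

// usage: val js = json"""{ "foo" : "bar", "foo2" : 123 }"""
```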

In String interpolation, you can also put Scala variables directly in the interpolated string. You can do the same in Json interpolation.

scala> val alpha = "foo"
alpha: String = foo

scala> val beta = 123L
beta: Long = 123

scala> val js = json"""{ "alpha" : "$alpha", "beta" : $beta}"""
js: play.api.libs.json.JsValue = {"alpha":"foo","beta":123}

scala> val gamma = Json.arr(1, 2, 3)
gamma: play.api.libs.json.JsArray = [1,2,3]

scala> val delta = Json.obj("key1" -> "value1", "key2" -> "value2")
delta: play.api.libs.json.JsObject = {"key1":"value1","key2":"value2"}

scala> val js = json"""
     |         {
     |           "alpha" : "$alpha",
     |           "beta" : $beta,
     |           "gamma" : $gamma,
     |           "delta" : $delta,
     |           "eta" : {
     |             "foo" : "bar",
     |             "foo2" : [ "bar21", 123, true, null ]
     |           }
     |         }
     |       """
js: play.api.libs.json.JsValue = {"alpha":"foo","beta":123,"gamma":[1,2,3],"delta":{"key1":"value1","key2":"value2"},"eta":{"foo":"bar","foo2":["bar21",123,true,null]}}

Please note that string variables must be put between "..." because otherwise the parser will complain.

Ok, so now it’s really trivial to write Json, isn’t it?

String interpolation just replaces the string you write in your code with some Scala code concatenating the pieces of string with the variables, just as you would write yourself. Kind of: s"toto ${v1} tata" => "toto " + v1 + " tata".
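You can check this desugaring yourself by calling StringContext (the class the compiler targets) directly:

```scala
val v1 = "world"
// s"hello ${v1}!" desugars to roughly:
val s = new StringContext("hello ", "!").s(v1)
// s == "hello world!"
```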

But it doesn’t compile your string into Json at compile-time: with string interpolation, the Json parsing is done at runtime. So Json interpolation doesn’t provide compile-time type safety or parsing for now.

In the future, I may replace the string interpolation by a real macro which will also parse the string at compile-time. Meanwhile, if you want to rely on type safety, keep using the Json.obj / Json.arr API.


Json pattern matching

What is one of the first features you discover when learning Scala that immediately makes you say: “Whoaa, cool feature”? Pattern matching.

You can write:

scala> val opt = Option("toto")
opt: Option[String] = Some(toto)

scala> opt match {
  case Some(s) => s"not empty option:$s"
  case None    => "empty option"
}
res2: String = not empty option:toto

// or direct variable assignment using pattern matching

scala> val Some(s) = opt
s: String = toto

Why not do this with Json?

And…. Here it is with Json pattern matching!!!

scala> val js = Json.obj("foo" -> "bar", "foo2" -> 123L)
js: play.api.libs.json.JsObject = {"foo":"bar","foo2":123}

scala> js match {
  case json"""{ "foo" : $a, "foo2" : $b }""" => Some(a -> b)
  case _ => None
}
res5: Option[(play.api.libs.json.JsValue, play.api.libs.json.JsValue)] =
Some(("bar",123))

scala> val json"""{ "foo" : $a, "foo2" : $b}""" = json""" { "foo" : "bar", "foo2" : 123 }"""
a: play.api.libs.json.JsValue = "bar"
b: play.api.libs.json.JsValue = 123

scala> val json"[ $v1, 2, $v2, 4 ]" = Json.arr(1, 2, 3, 4)
v1: play.api.libs.json.JsValue = 1
v2: play.api.libs.json.JsValue = 3

Magical?

Not at all… Just an unapplySeq built upon the tool that enables this kind of Json manipulation as trees: JsZipper

The more I use JsZippers, the more I find places where I can use them ;)
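As a side note, the same desugaring is available to anyone: in pattern position, an interpolated string like x"$a=$b" becomes an extractor call StringContext(parts).x(a, b). Here is a minimal sketch with a regex-backed interpolator (supported since Scala 2.11; the x name is mine, and this is unrelated to the actual JsZipper internals):

```scala
implicit class RegexInterpolation(sc: StringContext) {
  // Build a regex with one capturing group per interpolation hole;
  // Regex.unapplySeq then binds each group to the corresponding pattern
  def x: scala.util.matching.Regex =
    sc.parts.map(java.util.regex.Pattern.quote).mkString("(.+)").r
}

// direct variable assignment, like the Json examples above
val x"$key:$value" = "name:bob"
// key == "name", value == "bob"

// and in a match expression
val res = "age=25" match {
  case x"$k=$v" => s"$k -> $v"
  case _        => "no match"
}
// res == "age -> 25"
```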


More complex Json pattern matching

scala> val js = json"""{
    "key1" : "value1",
    "key2" : [
      "alpha",
      { "foo" : "bar",
        "foo2" : {
          "key21" : "value21",
          "key22" : [ "value221", 123, false ]
        }
      },
      true,
      123.45
    ]
  }"""
js: play.api.libs.json.JsValue = {"key1":"value1","key2":["alpha",{"foo":"bar","foo2":{"key21":"value21","key22":["value221",123,false]}},true,123.45]}

scala> val json"""{ "key1" : $v1, "key2" : ["alpha", $v2, true, $v3] }""" = js
v1: play.api.libs.json.JsValue = "value1"
v2: play.api.libs.json.JsValue = {"foo":"bar","foo2":{"key21":"value21","key22":["value221",123,false]}}
v3: play.api.libs.json.JsValue = 123.45

scala> js match {
    case json"""{
      "key1" : "value1",
      "key2" : ["alpha", $v1, true, $v2]
    }"""   => Some(v1, v2)
    case _ => None
  }
res9: Option[(play.api.libs.json.JsValue, play.api.libs.json.JsValue)] =
Some(({"foo":"bar","foo2":{"key21":"value21","key22":["value221",123,false]}},123.45))

// A non matching example maybe ? ;)
scala>  js match {
    case json"""{
      "key1" : "value1",
      "key2" : ["alpha", $v1, false, $v2]
    }"""   => Some(v1, v2)
    case _ => None
  }
res10: Option[(play.api.libs.json.JsValue, play.api.libs.json.JsValue)] = None

If you like that, please tell it so that I know whether it’s worth pushing it to Play Framework!


Using these features right now in a Scala/SBT project

These features are part of my experimental project JsZipper presented in this article.

To use it, add the following lines to your SBT Build.scala:

object ApplicationBuild extends Build {
  ...
  val mandubianRepo = Seq(
    "Mandubian repository snapshots" at "https://github.com/mandubian/mandubian-mvn/raw/master/snapshots/",
    "Mandubian repository releases" at "https://github.com/mandubian/mandubian-mvn/raw/master/releases/"
  )
  ...

  val main = play.Project(appName, appVersion, appDependencies).settings(
    resolvers ++= mandubianRepo,
    libraryDependencies ++= Seq(
      ...
      "play-json-zipper"  %% "play-json-zipper"    % "0.1-SNAPSHOT",
      ...
    )
  )
  ...
}

In your Scala code, import the following packages:

import play.api.libs.json._
import syntax._
import play.api.libs.functional.syntax._
import play.api.libs.json.extensions._

PatternMatch your fun!





Now you should use Play-Autosource 2.0, which corrects a few issues & introduces ActionBuilder from Play 2.2


The module code and sample app can be found on Github here


Here we go:

0’ : Create App

> play2 new auto-persons
       _            _
 _ __ | | __ _ _  _| |
| '_ \| |/ _' | || |_|
|  __/|_|\____|\__ (_)
|_|            |__/

play! 2.1.1 (using Java 1.7.0_21 and Scala 2.10.0), http://www.playframework.org

The new application will be created in /Users/pvo/zenexity/workspaces/workspace_mandubian/auto-persons

What is the application name? [auto-persons]
>

Which template do you want to use for this new application?

  1             - Create a simple Scala application
  2             - Create a simple Java application

> 1
OK, application auto-persons is created.

Have fun!

10’ : edit project/Build.scala, add play-autosource:reactivemongo dependency

val mandubianRepo = Seq(
  "Mandubian repository snapshots" at "https://github.com/mandubian/mandubian-mvn/raw/master/snapshots/",
  "Mandubian repository releases" at "https://github.com/mandubian/mandubian-mvn/raw/master/releases/"
)

val appDependencies = Seq()

val main = play.Project(appName, appVersion, appDependencies).settings(
  resolvers ++= mandubianRepo,
  libraryDependencies ++= Seq(
    "play-autosource"   %% "reactivemongo"       % "1.0-SNAPSHOT",
    "org.specs2"        %% "specs2"              % "1.13"        % "test",
    "junit"              % "junit"               % "4.8"         % "test"
  )
)

30’ : Create new ReactiveMongo AutoSource Controller in app/Person.scala

package controllers

import play.api._
import play.api.mvc._

// BORING IMPORTS
// Json
import play.api.libs.json._
import play.api.libs.functional.syntax._
// Reactive JSONCollection
import play.modules.reactivemongo.json.collection.JSONCollection
// Autosource
import play.autosource.reactivemongo._
// AutoSource is Async so imports Scala Future implicits
import scala.concurrent.ExecutionContext.Implicits.global
import play.api.Play.current

// >>> THE IMPORTANT PART <<<
object Persons extends ReactiveMongoAutoSourceController[JsObject] {
  val coll = db.collection[JSONCollection]("persons")
}

50’ : Add AutoSource routes at the beginning of conf/routes

->      /person                     controllers.Persons

60’ : Create conf/play.plugins to initialize ReactiveMongo Plugin

400:play.modules.reactivemongo.ReactiveMongoPlugin

70’ : Append to conf/application.conf to initialize MongoDB connection

mongodb.uri = "mongodb://localhost:27017/persons"

80’ : Launch application

> play2 run

[info] Loading project definition from /.../auto-persons/project
[info] Set current project to auto-persons (in build file:/.../auto-persons/)

[info] Updating {file:/.../auto-persons/}auto-persons...
[info] Done updating.
--- (Running the application from SBT, auto-reloading is enabled) ---

[info] play - Listening for HTTP on /0:0:0:0:0:0:0:0:9000

(Server started, use Ctrl+D to stop and go back to the console...)
[info] Compiling 5 Scala sources and 1 Java source to /.../auto-persons/target/scala-2.10/classes...
[warn] there were 2 feature warnings; re-run with -feature for details
[warn] one warning found
[success] Compiled in 6s

100’ : Insert your first 2 persons with Curl

>curl -X POST -d '{ "name":"bob", "age":25 }' --header "Content-Type:application/json" http://localhost:9000/person --include

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 33

{"id":"51b868ef31d4002c0bac8ba4"} -> oh a BSONObjectId

>curl -X POST -d '{ "name":"john", "age":43 }' --header "Content-Type:application/json" http://localhost:9000/person --include

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 33

{"id":"51b868fa31d4002c0bac8ba5"}

110’ : Get all persons

>curl http://localhost:9000/person --include

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 118

[
  {"name":"bob","age":25.0,"id":"51b868ef31d4002c0bac8ba4"},
  {"name":"john","age":43.0,"id":"51b868fa31d4002c0bac8ba5"}
]

115’ : Delete one person

>curl -X DELETE http://localhost:9000/person/51b868ef31d4002c0bac8ba4 --include

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 33

{"id":"51b868ef31d4002c0bac8ba4"}

120’ : Verify person was deleted

>curl -X GET http://localhost:9000/person/51b868ef31d4002c0bac8ba4 --include

HTTP/1.1 404 Not Found
Content-Type: text/plain; charset=utf-8
Content-Length: 37

ID 51b868ef31d4002c0bac8ba4 not found

125’ : Update person

>curl -X PUT -d '{ "name":"john", "age":35 }' --header "Content-Type:application/json" http://localhost:9000/person/51b868fa31d4002c0bac8ba5 --include

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 33

{"id":"51b868fa31d4002c0bac8ba5"}

130’ : Batch insert 2 persons (johnny & tom) with more properties

>curl -X POST -d '[{ "name":"johnny", "age":15, "address":{"city":"Paris", "street":"rue quincampoix"} },{ "name":"tom", "age":3, "address":{"city":"Trifouilly", "street":"rue des accidents de poucettes"} }]' --header "Content-Type:application/json" http://localhost:9000/person/batch --include

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 8

{"nb":2}

135’ : Get all persons whose name begins with “john”

>curl -X POST -d '{"name":{"$regex":"^john"}}' --header "Content-Type:application/json" http://localhost:9000/person/find --include

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 175

[
  {"name":"john","age":35.0,"id":"51b868fa31d4002c0bac8ba5"},
  {"id":"51b86a1931d400bc01ac8ba8","name":"johnny","age":15.0,"address":{"city":"Paris","street":"rue quincampoix"}}
]

140’ : Delete all persons

>curl -X DELETE -d '{}' --header "Content-Type:application/json" http://localhost:9000/person/batch --include

HTTP/1.1 200 OK
Content-Type: text/plain; charset=utf-8
Content-Length: 7

deleted

145’ : Take 5’ rest


150’ : Done



So what was demonstrated here?

With Play-Autosource, in a few lines, you obtain:

  • An abstract datasource backend (here implemented for ReactiveMongo, but it could be implemented for other DBs)
  • All CRUD operations are exposed as pure REST services
  • The datasource is typesafe (here JsObject but we’ll show later that we can use any type)

It can be useful to kickstart any application in which you’re going to work iteratively on your data models in direct interaction with the front-end.

It could also be useful to frontend developers who need to bootstrap frontend code with a Play Framework application backend. With Autosource, they don’t have to care about strictly modeling a datasource on the server side and can dig into their client-side code quite quickly.



Adding constraints & validation

Now you tell me: “Hey, that’s stupid: you store JsObject directly, but my data are structured and must be validated before being inserted.”

Yes you’re right, so let’s add some type constraints on our data:

object Persons extends ReactiveMongoAutoSourceController[JsObject] {
  val coll = db.collection[JSONCollection]("persons")

  // we validate the received Json as JsObject because the autosource type is JsObject
  // and we add classic validations on types
  override val reader = __.read[JsObject] keepAnd (
    (__ \ "name").read[String] and
    (__ \ "age").read[Int](Reads.min(0) keepAnd Reads.max(117))
  ).tupled
}

Try it now:

curl -X POST -d '{ "nameXXX":"bob", "age":25 }' --header "Content-Type:application/json" http://localhost:9000/person --include

HTTP/1.1 400 Bad Request
Content-Type: application/json; charset=utf-8
Content-Length: 62

{"obj.name":[{"msg":"validate.error.missing-path","args":[]}]}

You can progressively add constraints on your data in a few lines. With AutoSource, you don’t need to determine the exact shape of your models immediately and you can work with JsObject directly as long as you need. Sometimes, you’ll even discover that you don’t need a structured model at all and JsObject will be enough. (but I still advise to design things a bit before implementing ;))

Keep in mind that our sample is based on an implementation for ReactiveMongo, so using Json is natural. For other DBs, other data structures might be more idiomatic…



Use typesafe models

Now you tell me: “Funny, but JsObject is evil because it’s not strict enough. I’m an OO developer (maybe abused by ORM gurus when I was young) and my models are case-classes…”

Yes you’re right: sometimes you need more business logic, or you want to separate concerns very strictly, and your model will be shaped as case-classes.

So let’s replace our nice little JsObject with a more serious case class.

// the model
case class Person(name: String, age: Int)
object Person{
  // the famous Json macro which generates a Format[Person] at compile-time in a one-liner
  implicit val fmt = Json.format[Person]
}

// The autosource... shorter than before
object Persons extends ReactiveMongoAutoSourceController[Person] {
  val coll = db.collection[JSONCollection]("persons")
}

Please note that I removed the validations introduced before because they are not useful anymore: using Json macros, I created an implicit Format[Person] which is used implicitly by AutoSource.

So, now you can see why I consider AutoSource as a typesafe datasource.



Let’s be front-sexy with AngularJS

You all know that AngularJS is the new kid on the block and that you must use it if you want to be sexy nowadays.

I’m already sexy so I must be able to use it without understanding anything about it, and that’s exactly what I’ve done: in 30mn, without knowing anything about Angular (but a few concepts), I wrote a dumb CRUD front page plugged into my wonderful AutoSource.


Client DS in app/assets/javascripts/persons.js

This is the most important part of this sample: we need to call our CRUD autosource endpoints from angularJS.

We are going to use Angular resources for it, even if it’s not really the best feature of AngularJS. Anyway, in a few lines, it works pretty well for my simple case.

(thanks to Paul Dijou for reviewing this code because, I repeat, I don’t know angularJS at all and I wrote this in 20mn without trying to understand anything :D)

var app =
  // injects ngResource
  angular.module("app", ["ngResource"])
  // creates the Person factory backed by our autosource
  // Please note the url person/:id, which will transparently use our CRUD AutoSource endpoints
  .factory('Person', ["$resource", function($resource){
    return $resource('person/:id', { "id" : "@id" });
  }])
  // creates a controller
  .controller("PersonCtrl", ["$scope", "Person", function($scope, Person) {

    $scope.createForm = {};

    // retrieves all persons
    $scope.persons = Person.query();

    // creates a person using createForm and refreshes list
    $scope.create = function() {
      var person = new Person({name: $scope.createForm.name, age: $scope.createForm.age});
      person.$save(function(){
        $scope.createForm = {};
        $scope.persons = Person.query();
      })
    }

    // removes a person and refreshes list
    $scope.remove = function(person) {
      person.$remove(function() {
        $scope.persons = Person.query();
      })
    }

    // updates a person and refreshes list
    $scope.update = function(person) {
      person.$save(function() {
        $scope.persons = Person.query();
      })
    }
}]);

CRUD UI in index.scala.html

Now let’s create our CRUD UI page using angular directives. We need to be able to:

  • list persons
  • update/delete each person
  • create new persons
@(message: String)

@main("Welcome to Play 2.1") {

    <div ng-controller="PersonCtrl">
      <!-- create form -->
      <label for="name">name:</label><input ng-model="createForm.name"/>
      <label for="age">age:</label><input ng-model="createForm.age" type="number"/>
      <button ng-click="create()">Create new person</button>
      <hr/>
      <!-- List of persons with update/delete buttons -->
      <table>
      <thead><tr><th>name</th><th>age</th><th>actions</th></tr></thead>
      <tbody ng-repeat="person in persons">
        <tr>
          <td><input ng-model="person.name"/></td>
          <td><input type="number" ng-model="person.age"/></td>
          <td><button ng-click="update(person)">Update</button><button ng-click="remove(person)">Delete</button></td>
        </tr>
      </tbody>
      </table>
    </div>

}

Import Angular in main.scala.html

We need to import angularjs in our application and create the angular application using ng-app:

@(title: String)(content: Html)

<!DOCTYPE html>

<!-- please note the directive ng-app to initialize angular app-->
<html ng-app="app">
    <head>
        <title>@title</title>
        <link rel="stylesheet" media="screen" href="@routes.Assets.at("stylesheets/main.css")">
        <link rel="shortcut icon" type="image/png" href="@routes.Assets.at("images/favicon.png")">
        <script src="@routes.Assets.at("javascripts/jquery-1.9.0.min.js")" type="text/javascript"></script>
        <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.1.5/angular.min.js"></script>
        <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.1.5/angular-resource.min.js"></script>

        <script src="@routes.Assets.at("javascripts/persons.js")" type="text/javascript"></script>
    </head>
    <body>
        @content
    </body>
</html>

What else??? Oh yes Security…

I know what you think: “Uhuh, the poor guy who exposes his DB directly on the network and who is able to delete everything without any security”

Once again, you’re right. (yes I know I love flattery)

Autosource is not secured in any way by default, and actually I don’t really care about security here because it is your job to secure your exposed APIs; there are so many ways to secure services that I prefer to let you choose the one you want.

Anyway, I’m a nice boy and I’m going to show you how you could secure the DELETE endpoint using the authentication action composition sample given in the Play Framework documentation.

// Iteratee Done & BSONObjectID are needed below
import play.api.libs.iteratee.Done
import reactivemongo.bson.BSONObjectID

// FAKE USER class to simulate a user extracted from DB.
case class User(name: String)
object User {
  def find(name: String) = Some(User(name))
}

object Persons extends ReactiveMongoAutoSourceController[Person] {
  // The action composition helper, directly copied from the Play Framework doc
  def Authenticated(action: User => EssentialAction): EssentialAction = {
    // Let's define a helper function to retrieve a User
    def getUser(request: RequestHeader): Option[User] = {
      request.session.get("user").flatMap(u => User.find(u))
    }

    // Now let's define the new Action
    EssentialAction { request =>
      getUser(request).map(u => action(u)(request)).getOrElse {
        Done(Unauthorized)
      }
    }
  }

  val coll = db.collection[JSONCollection]("persons")

  // >>> IMPORTANT PART <<<
  // We simply override the delete action
  // If authenticated, we call the original action
  override def delete(id: BSONObjectID) = Authenticated { _ =>
    super.delete(id)
  }

  def index = Action {
    Ok(views.html.index("ok"))
  }

  // the login action which log any user
  def login(name: String) = Action {
    Ok("logged in").withSession("user" -> name)
  }

  // the logout action which log out any user
  def logout = Action {
    Ok("logged out").withNewSession
  }
}

Nothing too complicated here. If you need to add headers to your responses or params to the querystring, it’s easy to wrap autosource actions. Please refer to the Play Framework doc for more info…

I won’t try it here, as the article is already too long, but it should work…



Play-Autosource is DB agnostic

Play-Autosource Core is independent of the DB and provides Reactive (Async/Nonblocking) APIs to fulfill PlayFramework requirements.

Naturally, this 1st implementation uses ReactiveMongo, which is one of the best examples of a reactive DB driver. MongoDB fits very well in this concept too because document records map really naturally to JSON datasources.

But other implementations for other DB can be done and I count on you people to contribute them.

DB implementation contributions are welcome (Play-Autosource is just Apache2 licensed) and the AutoSource API is subject to evolution if parts of it appear to be erroneous.



Conclusion

Play-Autosource provides a very fast & lightweight way to create a REST CRUD typesafe datasource in your Play/Scala application. You can begin with blob data such as JsObject and then elaborate the model of your data progressively by adding constraints or types to it.

There would be many more things to say about Play/Autosource:

  • you can also override writers to change output format
  • you have some alpha streaming API also
  • etc…

There are also lots of features to improve/add because it’s still a very draft module.

If you like it and have ideas, don’t hesitate to discuss, to contribute, to improve etc…

curl -X POST -d '{ "coding" : "Have fun" }' --header "Content-Type:application/json" http://localhost:9000/developer

PS: Thanks to James Roper for his article about advanced routing in Play Framework, which I shamelessly copied XD





EXPERIMENTAL / DRAFT

The sample app can be found on Github here


Hi again folks!

Now, you may certainly have realized I’m a Play2.1 Json API advocate. But you may also have understood that I’m not interested in Json as an end in itself. What catches my attention is that it’s a versatile tree-like data structure that can be used on web servers & clients, in DBs such as ReactiveMongo, and also when communicating between servers with WebServices.

So I keep exploring what can be done with Json (especially in the context of the PlayFramework reactive architecture) and building the tools required to concretize my ideas.

My last article introduced JsPath Pattern Matching and I told you that I needed this tool to use it with JsZipper. It’s time to use it…

Here is what I want to do:

  • Build a Json structure dynamically by aggregating data obtained by calling several external WS such as the Twitter API, the Github API or whatever other API.
  • Build this structure from a Json template stored in MongoDB in which I will find the URL and params of WebServices to call.
  • Use Play2.1/WS & ReactiveMongo reactive API meaning resulting Json should be built in an asynchronous and non-blocking way.
  • Use the concept of JsZipper introduced in my previous article to modify Play2.1/Json immutable structures efficiently.

Please note that this idea and its implementation are just an exercise of style meant to study the idea and introduce technical concepts, so it might naturally seem a bit contrived. Moreover, keep in mind that the JsZipper API is still a draft…


The idea of Json template

Imagine I want to gather twitter user timeline and github user profile in a single Json object.

I also would like to:

  • configure the URL of WS and query parameters to fetch data
  • customize the resulting Json structure

Let’s use a Json template such as:

{
  "streams" : {
    "twitter" : {
      "url" : "http://localhost:9000/twitter/statuses/user_timeline",
      "user_id" : "twitter_nick"
    },
    "github" : {
      "url" : "http://localhost:9000/github/users",
      "user_id" : "github_nick"
    }
  }
}

Using the url and user_id found in __ \ "streams" \ "twitter", I can call the Twitter API to fetch the stream of tweets, and do the same for __ \ "streams" \ "github". Finally I replace the content of each node as follows:

{
  "streams" : {
    "twitter" : {
      // TWITTER USER TIMELINE HERE
    },
    "github" : {
      // GITHUB USER PROFILE HERE
    }
  }
}

Moreover, I’d like to store multiple templates like the previous sample, with multiple user_ids, to be able to retrieve multiple streams at the same time.


Creating Json template in Play/ReactiveMongo (v0.9)

Recently, Stephane Godbillon released ReactiveMongo v0.9 with the corresponding Play plugin. This version really improves and eases the way you can manipulate Json directly with Play & Mongo from Scala.

Let’s store a few instances of the previous template using this API:

// gets my mongo collection
def coll = db.collection[JSONCollection]("templates")

def provision = Action { Async {
  val values = Enumerator(
    Json.obj(
      "streams" -> Json.obj(
        "twitter" -> Json.obj(
          "url" -> "http://localhost:9000/twitter/statuses/user_timeline",
          "user_id" -> "twitter_nick1"
        ),
        "github" -> Json.obj(
          "url" -> "http://localhost:9000/github/users",
          "user_id" -> "github_nick1"
        )
      )
    ),
    ... more templates
  )

  coll.bulkInsert(values).map{ nb =>
    Ok(Json.obj("nb"->nb))
  }

} }

Hard isn’t it?

Note that I use localhost URLs because with the real Twitter/Github APIs I would need OAuth2 tokens, and that would be a pain for this sample :)



Reactive Json crafting

Now, let’s do the real job, i.e. the following steps:

  • retrieve the template(s) from Mongo using ReactiveMongo JsonCollection
  • call the WebServices to fetch the data using Play Async WS
  • update the Json template(s) using Monadic JsZipper JsZipperM[Future]

The interesting technical points here are that:

  • ReactiveMongo is async so we get Future[JsValue]
  • Play/WS is Async so we get also Future[JsValue]
  • We need to call multiple WS so we have a Seq[Future[JsValue]]

We could use the Play/Json transformers presented in a previous article but, since you have to manage Futures and multiple WS calls, it would result in quite complicated code.

Here is where Monadic JsZipper becomes interesting:

  • JsZipper allows modifying immutable JsValue which is already cool

  • JsZipperM[Future] allows modifying JsValue in the future and it’s even better!

Actually the real power of JsZipper (besides being able to modify/delete/create a node in an immutable Json tree) is to transform a Json tree into a Stream of nodes that it can traverse in depth, in width or in whatever order you need.
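To give an intuition of that last point, here is a toy sketch of the idea (a hand-rolled mini-tree, not the real JsZipper API): turning an immutable tree into a lazy Stream of (path, node) pairs that can then be filtered or traversed on demand:

```scala
// A minimal Json-like tree, just for illustration
sealed trait Js
case class JsObj(fields: List[(String, Js)]) extends Js
case class JsStr(value: String)              extends Js

// Lazy depth-first stream of (path, node) pairs
def nodes(js: Js, path: List[String] = Nil): Stream[(List[String], Js)] = js match {
  case JsObj(fields) =>
    (path, js) #:: fields.toStream.flatMap { case (k, v) => nodes(v, path :+ k) }
  case leaf => Stream((path, leaf))
}

val tree = JsObj(List(
  "streams" -> JsObj(List(
    "twitter" -> JsStr("tw"),
    "github"  -> JsStr("gh")
  ))
))

// Collect every leaf whose last path element is "twitter"
val twitters = nodes(tree).collect {
  case (path, JsStr(v)) if path.lastOption.contains("twitter") => v
}.toList
// twitters == List("tw")
```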


Less code with WS sequential calls

Here is the code right away, because you’ll see how easy it is:

// a helper to call WS
def callWSFromTemplate(value: JsValue): Future[JsValue] =
  WS.url((value \ "url").as[String])
    .withQueryString( "user_id" -> (value \ "user_id").as[String] )
    .get().map{ resp => resp.json }

// calling WS sequentially
def dataSeq = Action{
  Async{
    for{
      templates <- coll.find(Json.obj()).cursor[JsObject].toList   // retrieves templates from Mongo
      updated   <- Json.toJson(templates).updateAllM{
        case (_ \ "twitter", value) => callWSFromTemplate(value)
        case (_ \ "github", value)  => callWSFromTemplate(value)
        case (_, value)             => Future.successful(value)
      }
    } yield Ok(updated)
  }
}

Please note:

  • Json.toJson(templates) transforms a List[JsObject] into JsArray because we want to manipulate pure JsValue with JsZipperM[Future].

  • .updateAllM( (JsPath, JsValue) => Future[JsValue] ) is a wrapper API hiding the construction of a JsZipperM[Future]: once built, the JsZipperM[Future] traverses the Json tree and, for each node, calls the provided function, flatMapping on Futures before going to the next node. This makes the calls to WS sequential, not parallel.

  • case (_ \ "twitter", value) : yes here is the JsPath pattern matching and imagine the crazy stuff you can do mixing Json traversal and pattern matching ;)

  • Async means the embedded code returns a Future[Result]; but remember that the absence of Async DOESN’T mean an Action is synchronous/blocking, because in Play, everything is asynchronous/non-blocking by default.

Then you could tell me that this is cool but that the WS are called sequentially, not in parallel. Yes, that’s true, but keep in mind this is less than 10 lines of code and could even be reduced. Still, here is the parallelized version…


Parallel WS calls

def dataPar = Action{
  Async{
    coll.find(Json.obj()).cursor[JsObject].toList.flatMap{ templates =>
      // converts List[JsObject] into JsArray
      val jsonTemplates = Json.toJson(templates)

      // gathers all nodes that need to be updated
      val nodes = jsonTemplates.findAll{
        case (_ \ "twitter", _) | (_ \ "github", _) => true
        case _ => false
      }

      // launches WS calls in parallel and updates original JsArray
      Future.traverse(nodes){
        case (path@(_ \ "twitter"), value) => callWSFromTemplate(value).map( resp => path -> resp )
        case (path@(_ \ "github"), value)  => callWSFromTemplate(value).map( resp => path -> resp )
      }.map{ pathvalues => Ok(jsonTemplates.set(pathvalues:_*)) }
    }
  }
}

Note that:

  • jsonTemplates.findAll( filter: (JsPath, JsValue) => Boolean ) traverses the Json tree and returns a Stream[(JsPath, JsValue)] containing the filtered nodes. This is not done with Future because we want to get all nodes now to be able to launch all WS calls in parallel.

  • Future.traverse(nodes)(T => Future[T]) traverses the filtered values and calls all WS in parallel.

  • case (path@(_ \ "twitter"), value) is just JsPath pattern matching once again keeping track of full path to be able to return it with the value path -> resp for next point.

  • jsonTemplates.set( (JsPath, JsValue)* ) finally updates all values at given path. Note how easy it is to update multiple values at multiple paths.

A bit less elegant than the sequential case, but not much.
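The difference between the two strategies boils down to how plain scala.concurrent Futures are combined. A self-contained sketch (call is a fake stand-in for the WS call; the function names are mine, not part of any API):

```scala
import scala.concurrent._
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// Fake stand-in for callWSFromTemplate
def call(id: Int): Future[Int] = Future(id * 2)

// Sequential, like updateAllM: each call is created inside the flatMap
// of the previous one, so it only starts once the previous has completed
def sequential(ids: List[Int]): Future[List[Int]] =
  ids.foldLeft(Future.successful(List.empty[Int])) { (acc, id) =>
    acc.flatMap(done => call(id).map(done :+ _))
  }

// Parallel, like the findAll + Future.traverse version: all futures
// are started eagerly, then the results are gathered in order
def parallel(ids: List[Int]): Future[List[Int]] =
  Future.traverse(ids)(call)

val a = Await.result(sequential(List(1, 2, 3)), 5.seconds) // List(2, 4, 6)
val b = Await.result(parallel(List(1, 2, 3)), 5.seconds)   // List(2, 4, 6)
```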



Conclusion

This sample is a bit simplistic, but you can see the potential of mixing those different tools together.

Alone, JsZipper and JsPath pattern matching provide very powerful ways of manipulating Json that Reads/Writes can’t achieve easily.

When you add reactive API on top of that, JsZipper becomes really interesting and elegant.

The sample app can be found on Github here

Have JsZipperM[fun]!





EXPERIMENTAL / DRAFT


While experimenting with the Play21/Json Zipper in my previous article, I needed to match patterns on JsPath and decided to explore this topic a bit.

This article just presents my experiments with JsPath pattern matching so that people interested in the topic can tell me whether they like it or not and what they would add or remove. So don’t hesitate to leave comments about it.

If the result is satisfying, I’ll propose it to Play team ;)

Let’s go to samples as usual.

Very simple pattern matching

match/case-style

scala> __ \ "toto" match {
  case __ \ key => Some(key)
  case _ => None
}
res0: Option[String] = Some(toto)

val-style

scala> val _ \ toto = __ \ "toto"
toto: String = toto

Note that I don’t write val __ \ toto = __ \ "toto" (with 2 underscores) as you would expect.

Why? Let’s write it:

scala> val __ \ toto = __ \ "toto"
<console>:20: error: recursive value x$1 needs type
val __ \ toto = __ \ "toto"

Actually, the 1st __ is considered by the Scala compiler as a variable to be bound. The variable __ would then appear on both the left and right sides, which is not good.

So I use _ to ignore its value because I know it’s __. If I absolutely wanted to match against __, I would have written:

scala> val JsPath \ toto = __ \ "toto"
toto: String = toto
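For the record, what makes these val patterns work is nothing more than an infix extractor: a pattern written parent \ key is resolved to \.unapply. A toy sketch with a hand-rolled path type (not the real JsPath):

```scala
// A toy path: Path(None, "alpha") \ "beta" \ "gamma"
case class Path(parent: Option[Path], key: String) {
  def \(k: String): Path = Path(Some(this), k)
}

// An infix pattern `parent \ key` just needs an object named \ with an unapply
object \ {
  def unapply(p: Path): Option[(Option[Path], String)] = Some((p.parent, p.key))
}

val full = Path(None, "alpha") \ "beta" \ "gamma"

// Binds last to "gamma", ignoring everything before it
val _ \ last = full
// last == "gamma"
```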

Pattern matching with indexed path

scala> val (_ \ toto)@@idx = (__ \ "toto")(2)
toto: String = toto
idx: Int = 2

scala> (__ \ "toto")(2) match {
  case (__ \ "toto")@@idx => Some(idx)
  case _      => None
}
res1: Option[Int] = Some(2)

Note the usage of the @@ operator, which you may dislike. I didn’t find anything better for now, but if anyone has a better idea, please tell me ;)


Pattern matching the last element of a JsPath

scala> val _ \ last = __ \ "alpha" \ "beta" \ "delta" \ "gamma"
last: String = gamma

Using _, I ignore everything before the gamma node.


Matching only the first element and the last one

scala> val _ \ first \?\ last = __ \ "alpha" \ "beta" \ "gamma" \ "delta"
first: String = alpha
last: String = delta

scala> val (_ \ first)@@idx \?\ last = (__ \ "alpha")(2) \ "beta" \ "gamma" \ "delta"
first: String = alpha
idx: Int = 2
last: String = delta

Note the \?\ operator, which is also a temporary choice: I didn’t want to choose \\ because the \?\ operator only works when you match between the first and the last element of the path, not between anything and anything…


A few more complex cases

scala> val (_ \ alpha)@@idx \ beta \ gamma \ delta = (__ \ "alpha")(2) \ "beta" \ "gamma" \ "delta"
alpha: String = alpha
idx: Int = 2
beta: String = beta
gamma: String = gamma
delta: String = delta

scala> val (_ \ alpha)@@idx \ _ \ _ \ delta = (__ \ "alpha")(2) \ "beta" \ "gamma" \ "delta"
alpha: String = alpha
idx: Int = 2
delta: String = delta

scala> val _@@idx \?\ gamma \ delta = (__ \ "alpha")(2) \ "beta" \ "gamma" \ "delta"
idx: Int = 2
gamma: String = gamma
delta: String = delta

scala> (__ \ "alpha")(2) \ "beta" \ "gamma" \ "delta" match {
  case _@@2 \ "beta" \ "gamma" \ _ => true
  case _ => false
}
res4: Boolean = true

And finally, using a regex?

scala> val pattern = """al(\d)*pha""".r
pattern: scala.util.matching.Regex = al(\d)*pha

scala> (__ \ "foo")(2) \ "al1234pha" \ "bar" match {
  case (__ \ "foo")@@idx \ pattern(_) \ "bar" => true
  case _ => false
}
res6: Boolean = true

So, I think we can provide more features, and now I’m going to use this with my JsZipper stuff in my next article ;)

If you like it, say so!

Have fun!





EXPERIMENTAL / DRAFT


The code is available on Github project play-json-zipper

JsZipper is a new tool allowing much more complex & powerful manipulations of Json structures with the Play2/Json Scala API (it is not part of Play2 core for now).

JsZipper is inspired by the Zipper concept introduced by Gérard Huet in 1997.

The Zipper allows updating immutable traversable structures in an efficient way. Json is an immutable AST, so it fits well. FYI, the Zipper behaves like a loupe that walks through each node of the AST (left/right/up/down) while staying aware of the nodes on its left, on its right and above it. The interesting idea behind the loupe is that when it targets a node, it can modify and even delete the focused node. The analogy with a pants zipper is quite good too: when it goes down the tree, it behaves as if it were opening the tree so that the loupe can be driven through all the nodes, and when it goes up, it closes the tree back… I won’t say more here, it would be too long.
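To make the idea concrete, here is a minimal list zipper in plain Scala — the simplest possible instance of Huet’s concept, and not the JsZipper implementation itself (ListZipper is a made-up name for this sketch):

```scala
// A zipper over a List: the focus element plus the reversed elements on its
// left. Updating the focus is O(1) and purely immutable; moving right
// "closes" the structure on the left and "opens" it on the right.
case class ListZipper[A](left: List[A], focus: A, right: List[A]) {
  // Move the loupe one element to the right, if possible.
  def moveRight: Option[ListZipper[A]] = right match {
    case h :: t => Some(ListZipper(focus :: left, h, t))
    case Nil    => None
  }
  // Replace the focused element without touching the rest.
  def update(a: A): ListZipper[A] = copy(focus = a)
  // Zip the structure back up into a plain list.
  def toList: List[A] = left.reverse ::: focus :: right
}

object ListZipper {
  def fromList[A](l: List[A]): Option[ListZipper[A]] = l match {
    case h :: t => Some(ListZipper(Nil, h, t))
    case Nil    => None
  }
}
```

For example, ListZipper.fromList(List(1, 2, 3)).flatMap(_.moveRight).map(_.update(9).toList) focuses the second element, replaces it, and rebuilds List(1, 9, 3). JsZipper applies the same principle to a tree-shaped AST instead of a flat list.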

JsZipper is a specific interpretation of the Zipper concept for the Play/Json API, based on:

  • Scala Streams to go through / update / construct Json AST nodes in a lazy way
  • Monadic aspects to provide funnier ways of manipulating the Json AST (please see below)


Please note that JsZipper is not an end in itself, but a tool for building new APIs to manipulate Json.

Let’s go to the samples, as they explain everything.

We’ll use the following Json object.

scala> val js = Json.obj(
  "key1" -> Json.obj(
    "key11" -> "TO_FIND",
    "key12" -> 123L,
    "key13" -> JsNull
  ),
  "key2" -> 123,
  "key3" -> true,
  "key4" -> Json.arr("TO_FIND", 345.6, "test", Json.obj("key411" -> Json.obj("key4111" -> "TO_FIND")))
)
js: play.api.libs.json.JsObject = {"key1":{"key11":"TO_FIND","key12":123,"key13":null},"key2":123,"key3":true,"key4":["TO_FIND",345.6,"test",{"key411":{"key4111":"TO_FIND"}}]}

Basic manipulations

Setting multiple paths/values

scala> js.set(
  (__ \ "key4")(2) -> JsNumber(765.23),
  (__ \ "key1" \ "key12") -> JsString("toto")
)
res1: play.api.libs.json.JsValue = {"key1":{"key11":"TO_FIND","key12":"toto","key13":null},"key2":123,"key3":true,"key4":["TO_FIND",345.6,765.23,{"key411":{"key4111":"TO_FIND"}}]}

Deleting multiple paths/values

scala> js.delete(
  (__ \ "key4")(2),
  (__ \ "key1" \ "key12"),
  (__ \ "key1" \ "key13")
)
res2: play.api.libs.json.JsValue = {"key1":{"key11":"TO_FIND"},"key2":123,"key3":true,"key4":["TO_FIND",345.6,{"key411":{"key4111":"TO_FIND"}}]}

Finding paths/values according to a filter

scala> js.findAll( _ == JsString("TO_FIND") ).toList
res5: List[(play.api.libs.json.JsPath, play.api.libs.json.JsValue)] = List(
  (/key1/key11,"TO_FIND"),
  (/key4(0),"TO_FIND"),
  (/key4(3)/key411/key4111,"TO_FIND")
)

Updating values according to a filter based on value

scala> js.updateAll( (_:JsValue) == JsString("TO_FIND") ){ js =>
  val JsString(str) = js
  JsString(str + "2")
}
res6: play.api.libs.json.JsValue = {"key1":{"key11":"TO_FIND2","key12":123,"key13":null},"key2":123,"key3":true,"key4":["TO_FIND2",345.6,"test",{"key411":{"key4111":"TO_FIND2"}}]}

Updating values according to a filter based on path+value

scala> js.updateAll{ (path, js) =>
  JsPathExtension.hasKey(path) == Some("key4111")
}{ (path, js) =>
  val JsString(str) = js
  JsString(str + path.path.last)
}
res1: play.api.libs.json.JsValue = {"key1":{"key11":"TO_FIND","key12":123,"key13":null},"key2":123,"key3":true,"key4":["TO_FIND",345.6,"test",{"key411":{"key4111":"TO_FIND/key4111"}}]}

Creating an object from scratch

scala> val build = JsExtensions.buildJsObject(
  __ \ "key1" \ "key11" -> JsString("toto"),
  __ \ "key1" \ "key12" -> JsNumber(123L),
  (__ \ "key2")(0)      -> JsBoolean(true),
  __ \ "key3"           -> Json.arr(1, 2, 3)
)
build: play.api.libs.json.JsValue = {"key1":{"key11":"toto","key12":123},"key3":[1,2,3],"key2":[true]}

Let’s be funnier with Monads now

Let’s use Future as our Monad because it’s… coooool to do things in the future ;)

Imagine you call several services returning Future[JsValue] and you want to build/update a JsObject from them. Until now, doing that with Play2/Json was quite tricky and required a fair amount of code.
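To see why, here is roughly what the hand-written version looks like with plain Scala Futures and a Map standing in for the JsObject (serviceA/serviceB are hypothetical names):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Two imaginary services, each returning a value asynchronously.
def serviceA(): Future[String] = Future { "toto" }
def serviceB(): Future[Double] = Future { 765.23 }

// Sequence both calls by hand, then build the result once everything
// has completed.
val combined: Future[Map[String, Any]] =
  for {
    a <- serviceA()
    b <- serviceB()
  } yield Map("key1" -> a, "key2" -> b)

val result: Map[String, Any] = Await.result(combined, 2.seconds)
```

This manual sequencing has to be rewritten for every shape of result; the monadic setM/updateAllM API below does the plumbing for you directly on the Json structure.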

Here is what you can do now.

Updating multiple FUTURE values at given paths

scala> val maybeJs = js.setM[Future](
  (__ \ "key4")(2)        -> future{ JsNumber(765.23) },
  (__ \ "key1" \ "key12") -> future{ JsString("toto") }
)
maybeJs: scala.concurrent.Future[play.api.libs.json.JsValue] = scala.concurrent.impl.Promise$DefaultPromise@6beb722d

scala> Await.result(maybeJs, Duration("2 seconds"))
res4: play.api.libs.json.JsValue = {"key1":{"key11":"TO_FIND","key12":"toto","key13":null},"key2":123,"key3":true,"key4":["TO_FIND",345.6,765.23,{"key411":{"key4111":"TO_FIND"}}]}

Updating multiple FUTURE values according to a filter

scala> val maybeJs = js.updateAllM[Future]( (_:JsValue) == JsString("TO_FIND") ){ js =>
  future {
    val JsString(str) = js
    JsString(str + "2")
  }
}
maybeJs: scala.concurrent.Future[play.api.libs.json.JsValue] = scala.concurrent.impl.Promise$DefaultPromise@35a4bb1a

scala> Await.result(maybeJs, Duration("2 seconds"))
res6: play.api.libs.json.JsValue = {"key1":{"key11":"TO_FIND2","key12":123,"key13":null},"key2":123,"key3":true,"key4":["TO_FIND2",345.6,"test",{"key411":{"key4111":"TO_FIND2"}}]}

Creating a FUTURE JsArray from scratch

scala> val maybeArr = JsExtensions.buildJsArrayM[Future](
  future { JsNumber(123.45) },
  future { JsString("toto") }
)
maybeArr: scala.concurrent.Future[play.api.libs.json.JsValue] = scala.concurrent.impl.Promise$DefaultPromise@220d48e4

scala> Await.result(maybeArr, Duration("2 seconds"))
res0: play.api.libs.json.JsValue = [123.45,"toto"]

It’s still a draft, so it can be improved, but if you like it, don’t hesitate to comment; and if people like it, it could become part of Play Framework itself.

Have fun!





The question

What’s the first word that comes to your mind when I say:

“Most basic concept of functional programming?”

For info, this dendrogram was pre-computed using a Play2.1 app ingesting Tweets & filtering/grouping the results in a very manual-o-matic way…

Have Fun(ctional)