Riding the Scalawave in 2016

Scala and Akka bend minds in this conference overview from Gdańsk, Poland.

Patryk Koryzna

Software Engineer

Posted on Feb 15, 2017

"Do not try and bend the spoon, that's impossible. Instead, only try to realize the truth... there is no spoon. Then you will see it is not the spoon that bends, it is only yourself."

This classic sci-fi quote seems to be quite a fitting summary of the workshop and talks I attended at Scalawave last November in Gdańsk, Poland. But instead of the spoon, there's Scala. And let me tell you, it gives you a lot of occasions to bend your mind. Let's go!

A workshop workout

The day before the conference was dedicated to workshops. I chose to participate in the one on type-level (meta)programming using Shapeless. If that sounds complicated to you, you'd be absolutely right.

Let me deconstruct this workshop title for you: the “type-level” part means operating on the types of the values your Scala programs compute with, as opposed to the regular value level. “Metaprogramming” literally means programming a level above your typical programs. So what this boils down to is software that manipulates software, or in other words, writing little programs that write other programs. What?

If it sounds completely alien (or even philosophical), you'd be surprised to know that you might have been using this method already without even knowing about it. Chances are you've used a Scala library that generates serialisation code (say, for JSON) at compile time. Such libraries use the advanced type system of the Scala language (and/or some macro magic for specific information not provided by types alone) to generate, at compile time, code that would otherwise have to be written by hand or via reflection – and no-one wants to write those JsObjects by hand. These abstractions can also help overcome some limits of the standard library (e.g. the 22-argument limit for functions).
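To make that concrete, here is a hedged sketch of what such compile-time derivation looks like from the user's side, assuming circe with its generic module on the classpath; the Speaker class is just an illustration, and any comparable library would look much the same:

```scala
import io.circe.generic.auto._  // compile-time derivation of Encoder/Decoder instances
import io.circe.syntax._        // provides .asJson

case class Speaker(name: String, talks: Int)

object JsonDemo extends App {
  // No hand-written JsObject anywhere: the Encoder[Speaker] is generated
  // by the compiler while this file is being compiled
  println(Speaker("Jon Pretty", 1).asJson.noSpaces)
  // {"name":"Jon Pretty","talks":1}
}
```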

Piotr Krzemiński, who led the workshop, took participants on a great, four-hour-long journey through the facilities available in Shapeless. We began with a quick introduction to the language features that make it all possible – namely implicits, type members, path-dependent types and more – and then moved on to the big leagues.

The first exercise required participants to write some very simple Peano arithmetic: defining the natural numbers.[1] The real challenge was to implement addition and comparisons, which meant recalling some long-unused knowledge from university days. Oh, and remember, this is type-level! No recursive functions were used – the actual calculation (ab)uses implicit resolution at compile time. Inefficient? Of course, but the goal here isn't some twisted kind of premature optimisation.
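Here is a minimal sketch of that exercise, reconstructed from memory rather than taken from the workshop materials: Peano naturals and addition encoded purely in types, with the compiler doing the "computation" while it resolves implicits.

```scala
// Peano naturals encoded as types: no values are ever constructed
sealed trait Nat
sealed trait Zero extends Nat
sealed trait Succ[N <: Nat] extends Nat

// Sum[A, B] is evidence that A + B equals the member type Out
trait Sum[A <: Nat, B <: Nat] { type Out <: Nat }

object Sum {
  type Aux[A <: Nat, B <: Nat, C <: Nat] = Sum[A, B] { type Out = C }

  // Base case: 0 + B = B
  implicit def sumZero[B <: Nat]: Aux[Zero, B, B] =
    new Sum[Zero, B] { type Out = B }

  // Inductive step: Succ(A) + B = Succ(A + B)
  implicit def sumSucc[A <: Nat, B <: Nat, C <: Nat](
      implicit rest: Aux[A, B, C]): Aux[Succ[A], B, Succ[C]] =
    new Sum[Succ[A], B] { type Out = Succ[C] }
}

object PeanoDemo extends App {
  type One   = Succ[Zero]
  type Two   = Succ[One]
  type Three = Succ[Two]

  // Compiles only because the compiler can derive evidence that 1 + 2 = 3
  implicitly[Sum.Aux[One, Two, Three]]

  // implicitly[Sum.Aux[One, Two, Two]]  // would not compile
  println("1 + 2 = 3, proved by the compiler")
}
```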

The next example demonstrated how these type-level numbers let you create a vector type that also carries information about its length. This can actually come in handy if you want to guarantee that the correct number of elements is passed to and/or returned by your functions.
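As a hedged sketch of the idea – a homegrown length-indexed vector built on the Nat encoding above, rather than Shapeless' own facility – the length becomes part of the type, so passing the wrong number of elements is a compile error:

```scala
// The same Nat/Zero/Succ encoding as in the previous sketch
sealed trait Nat
sealed trait Zero extends Nat
sealed trait Succ[N <: Nat] extends Nat

// A length-indexed vector: the number of elements is part of the type
sealed trait Vec[+A, N <: Nat]
case object VNil extends Vec[Nothing, Zero]
final case class VCons[A, N <: Nat](head: A, tail: Vec[A, N]) extends Vec[A, Succ[N]]

object VecDemo extends App {
  type Three = Succ[Succ[Succ[Zero]]]

  // This signature guarantees the caller passes exactly three doubles
  def meanOf3(v: Vec[Double, Three]): Double = {
    def sum[N <: Nat](v: Vec[Double, N]): Double = v match {
      case VNil        => 0.0
      case VCons(h, t) => h + sum(t)
    }
    sum(v) / 3
  }

  val points = VCons(1.0, VCons(2.0, VCons(3.0, VNil)))
  println(meanOf3(points)) // 2.0

  // meanOf3(VCons(1.0, VCons(2.0, VNil)))  // does not compile: wrong length
}
```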

The following part of the workshop took us closer to everyday challenges: exploring the mechanisms needed for automatic encoder derivation. Enter HList – a heterogeneous list, which can be thought of as either a list of differently typed elements (one that keeps this type information instead of coercing everything to a least upper bound like Any or Product with Serializable) or as an arbitrary-length tuple.
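A tiny example of the difference, assuming shapeless 2.x on the classpath:

```scala
import shapeless._

object HListDemo extends App {
  // The element types are preserved in the type of the list itself
  val record: Int :: String :: Boolean :: HNil =
    42 :: "scalawave" :: true :: HNil

  val n: Int    = record.head       // statically known to be an Int
  val s: String = record.tail.head  // statically known to be a String

  // A regular List collapses everything to the least upper bound, here Any
  val lost: List[Any] = List(42, "scalawave", true)

  println(s"$n, $s, type info lost in: $lost")
}
```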

This is more powerful than you might think. Using Generic, you can convert a case class to an HList, keeping the type information. You can then add the field names using some type-level trickery and dark magic[2], and you have everything you need to make your JSON/Protobuf/??? de-/encoder ready to accept any case class, without using any accessors or tightly coupling it to a concrete class.
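Below is a hedged, simplified sketch of this derivation pattern with a toy Encoder type class; the names (Encoder, Talk) are made up for illustration, and real libraries additionally pull in the field names via LabelledGeneric plus the macro-generated singleton types mentioned in the footnote:

```scala
import shapeless._

// A toy type class standing in for a real JSON/Protobuf encoder
trait Encoder[A] { def encode(a: A): String }

object Encoder {
  def instance[A](f: A => String): Encoder[A] = new Encoder[A] {
    def encode(a: A): String = f(a)
  }

  implicit val intEncoder: Encoder[Int]       = instance(_.toString)
  implicit val stringEncoder: Encoder[String] = instance(s => "\"" + s + "\"")

  // Encoders for the HList shape: the empty list, then head followed by tail
  implicit val hnilEncoder: Encoder[HNil] = instance(_ => "")
  implicit def hconsEncoder[H, T <: HList](implicit
      he: Encoder[H],
      te: Encoder[T]): Encoder[H :: T] =
    instance(hl => (he.encode(hl.head) + " " + te.encode(hl.tail)).trim)

  // Any case class with a Generic instance gets an Encoder for free
  implicit def genericEncoder[A, R](implicit
      gen: Generic.Aux[A, R],
      enc: Encoder[R]): Encoder[A] =
    instance(a => enc.encode(gen.to(a)))
}

case class Talk(title: String, minutes: Int)

object DerivationDemo extends App {
  def encode[A](a: A)(implicit e: Encoder[A]): String = e.encode(a)

  // No accessors, no coupling to Talk: the shape comes from Generic
  println(encode(Talk("Interpolating Strings Like a Boss", 45)))
  // "Interpolating Strings Like a Boss" 45
}
```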

However, in my opinion the most impressive part is that there aren't really many macros in play here. Most of the code is just using regular old Scala syntax, so you're not really changing the syntax (or bending the rules). Instead, you're bending your mind while using whatever the compiler can already do for you.

A side note: runtime reflection – the JVM's familiar getClass and friends – is also a form of metaprogramming: you're literally writing a program that knows about the program that's running. This approach is different, however, because it can only introspect code that has already been compiled, loaded, and is currently running. That has performance implications and is less safe (and, in the case of Java, less elegant, if you ask me).

Getting a good talking to

The next day was dedicated to talks, taking place in Stary Maneż (Old Manège) – a very impressive venue in its own right. The conference's first keynote was by Roland Kuhn, who presented on Distributed Systems and Composability, covering ways of describing distributed computations mathematically, such as pi calculus (think lambda calculus with channels) and Scribble, a "language to describe application-level protocols among communicating systems". One of the biggest complaints about distributed applications is their unpredictability, due to network errors, asynchronicity and more – these ideas definitely seem like a step in the right direction. And of course there is Akka Typed, providing a higher level of type safety.[3]

Next up was Jon Pretty's "Interpolating Strings Like a Boss", exploring the humble string interpolator mechanism in Scala. There were some surprising conclusions in this talk – did you know that you can actually create your own class named StringContext and provide your own implementation? It turns out that interpolation is just a simple text substitution performed by the compiler during the desugaring phase. Also, this can happen[4]. This shows how simple, everyday features have absolutely crazy corner cases. Throw a few macros into the mix and you're already getting into mind-bending territory.
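To illustrate the mechanism, here is a toy interpolator – not Slick's real sql interpolator, just a sketch of the desugaring:

```scala
object InterpolatorDemo extends App {
  final case class Query(sql: String, params: Seq[Any])

  // A new interpolator is just an extension method on StringContext
  implicit class QueryInterpolator(val sc: StringContext) {
    def query(args: Any*): Query =
      // keep the literal parts, replace each interpolated value with a placeholder
      Query(sc.parts.mkString("?"), args)
  }

  val user = "patryk"

  // The compiler rewrites the next line into roughly:
  //   StringContext("select * from speakers where name = ", "").query(user)
  val q = query"select * from speakers where name = $user"

  println(q.sql)     // select * from speakers where name = ?
  println(q.params)  // the captured argument "patryk"
}
```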

Another talk I would like to mention was given by Jan Pustelnik, about Reactive Streams for fast data processing. This being a Scala conference, I put it into the context of Akka Streams, which we use a lot here at Zalando Tech. Being familiar with these, the highlight for me was that stream-processing your data is not a new idea at all. In fact, it's almost as old as computers themselves – back in the day, when RAM was just a bunch of magnets on wires sewn together by hand, you didn't really have much space to spare. Hence, streaming algorithms were the way to go. And in the speaker's own words, the talk did "mix obscure algorithms found in dust-covered textbooks with the hottest newest features from the Streams ecosystem". The old is new again here, in a sense, because in place of spinning tape streaming data byte by byte, we now have blazingly fast SSDs with gigabit connections. On top of that, rather than a single rack-sized CPU, we have multicore processors running tens of threads. We also have the luxury of the GraphStage DSL rather than physically flipping switches or punching holes in cardboard.
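As a small, hedged illustration of that constant-memory idea with Akka Streams (assuming an Akka 2.4/2.5-era akka-stream dependency), here a large sequence is summed one element at a time rather than being materialised up front:

```scala
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Sink, Source}

object StreamingSum extends App {
  implicit val system: ActorSystem = ActorSystem("scalawave-demo")
  implicit val materializer: ActorMaterializer = ActorMaterializer()
  import system.dispatcher // ExecutionContext for the Future callback below

  Source(1 to 10000000)              // elements are produced on demand, not buffered up front
    .map(_.toLong)
    .runWith(Sink.fold(0L)(_ + _))   // folds one element at a time, in constant memory
    .foreach { sum =>
      println(s"sum = $sum")
      system.terminate()
    }
}
```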

Final thoughts

But what's so mind-bending here, you ask? Consider how many of the themes mentioned above seem so simple on the surface. Akka? Just sending messages! Yes, until you distribute your actors across machines (and it's not impossible to screw things up on a single machine either, when you're just a little less than careful). String interpolators? Just interspersing a few strings, what's the big deal? But then you have libraries that use them for much, much more than a nicer printf – for example Slick or ScalikeJDBC, which use them to build SQL statements. Akka Streams? You can just map all the things and call it a day. Yet you can also write your own graph stage to do fan-out, fan-in, stateful flows and more – not to mention the machinery under the hood that interprets the graph you've built and runs it on actors.

Shapeless? Well, Shapeless is mind-bending by definition – you can't really shrug it off that easily.

You may say that all of this is just layers of abstraction, as old as programming itself, and I definitely agree. But seeing how all of these things are achieved in mostly normal, run-of-the-mill Scala – without any fancy compiler plugins, at most with the help of a macro (itself written in Scala) – is, in my opinion, truly mind-blowing. Do you agree? Let me know via Twitter at @pkoryzna.


[1] Using mathematical induction - that is, starting with zero as the base case, and defining the other natural numbers as successors of each other, i.e. One = Successor(Zero), Two = Successor(One), etc.
[2] Singleton-typed symbols generated with macros.
[3] A popular joke in some circles was that the flagship product of a company called Typesafe was based on an Any => Unit method.
[4] Anything you write before a string literal becomes a method invocation on a StringContext instance - in this case, it's calling the equals method on the StringContext.


