Adding Scala 3 Support to Blindsight
I just released Blindsight 1.5.2 with Scala 3 support.
Scala 3, aka Dotty, is still very new, and there are some wrinkles involved in setting up cross-version support in a project, so this is my blog post / notes on how to set things up.
Blindsight is at once a good and a bad project to get started with: it has few underlying dependencies, but it does rely on macros that poke at some underdocumented areas of Scala 3.
IDE Support
The first thing is IDE Support. As of July 2021, IntelliJ IDEA doesn't deal well with two different versions of Scala, even in the nightly Scala plugins. You're likely to get red-lines and confusion, especially when dealing with Scala 3 macros.
Visual Studio Code with the Metals plugin is much more effective at parsing out Scala 3 and Scala 2 in the same project, especially with Metals 0.10.5, and so I ended up using that primarily. It installs bloop and some other stuff behind the scenes, but it doesn't require that you tweak it.
Traversing Scala 3 code for implementations and definitions is doable in VS Code, but it breaks down when using Scala 3 macros. I found that the best way to reference the code was to check out the Dotty source code in another VS Code window and search by hand. (Ironically, Metals doesn't like compiling Dotty because it tries to bootstrap itself. Oh well.)
SBT and CI
I upgraded to sbt 1.5.5, which helped with some cross-compilation issues, but ultimately I went with sbt-projectmatrix to implement cross-building at the project level. As a bonus, cross-building now happens in parallel.
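As a sketch of what project-level cross-building looks like with sbt-projectmatrix on the plugin classpath (the module name and Scala versions here are illustrative, not Blindsight's actual build):
// build.sbt sketch -- illustrative module and versions
lazy val scala213 = "2.13.6"
lazy val scala3   = "3.0.1"

lazy val core = (projectMatrix in file("core"))
  .settings(
    name := "blindsight-core"
  )
  .jvmPlatform(scalaVersions = Seq(scala213, scala3))
Each Scala version becomes its own subproject, which is what lets the builds run in parallel.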
Somewhere along the line Travis CI stopped working, so I moved to GitHub Actions, using sbt-github-actions as a template.
Setting up a ci.yml file was pretty straightforward. It got a bit tricky when it came to setting up the continuous integration task, but I leveraged sbt-commandmatrix to manage the dimensions using BUILD_KEY: $-$.
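For reference, here's a minimal sketch of the sbt-github-actions wiring (the setting values are illustrative, not Blindsight's actual configuration); the plugin generates .github/workflows/ci.yml from build.sbt via the githubWorkflowGenerate task:
// build.sbt sketch for sbt-github-actions
ThisBuild / githubWorkflowBuild := Seq(WorkflowStep.Sbt(List("test")))

// Don't generate a publish job in this sketch.
ThisBuild / githubWorkflowPublishTargetBranches := Seq.empty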
The only downside from my perspective is that sbt-projectmatrix uses more memory, enough to reliably cause out-of-memory errors when releasing. Throwing more memory at the problem makes it go away:
export SBT_OPTS="-Xms512M -Xmx4g -Xss2M -XX:MaxMetaspaceSize=1024M"
Tooling
I needed to set up .scalafmt.conf with some extra sections to handle Scala 3 code differently:
version = 3.0.0-RC6
fileOverride {
  "glob:**/api/src/main/scala-3/**" {
    runner.dialect = scala3
  }
  "glob:**/inspections/src/test/scala-3/**" {
    runner.dialect = scala3
  }
  "glob:**/inspections/src/main/scala-3/**" {
    runner.dialect = scala3
  }
}
Library Incompatibilities
Setting up the different scalac options and different libraries in build.sbt was nothing new, but I do wish that sbt had something built in for this.
For scalac options, it comes down to a custom def:
def scalacOptionsVersion(scalaVersion: String): Seq[String] = {
  CrossVersion.partialVersion(scalaVersion) match {
    case Some((3, n)) =>
      Seq("-deprecation", "-feature", "-unchecked") // representative flags -- the real list is longer
    case Some((2, n)) if n >= 13 =>
      Seq("-deprecation", "-feature", "-unchecked", "-Xlint") // representative 2.13 flags
    case Some((2, n)) if n == 12 =>
      Seq("-deprecation", "-feature", "-unchecked", "-Xlint") // representative 2.12 flags
    case Some((2, n)) if n == 11 =>
      Seq("-deprecation", "-feature", "-unchecked") // representative 2.11 flags
    case _ =>
      Seq.empty
  }
}
and then you can just pass it in:
lazy val logstash = (projectMatrix in file("logstash"))
  .settings(AutomaticModuleName.settings("com.tersesystems.blindsight.logstash"))
  .settings(
    name := "blindsight-logstash",
    scalacOptions := scalacOptionsVersion(scalaVersion.value)
    // ...
  )
Also, the scala-reflect
library isn't used in Scala 3:
libraryDependencies ++= {
  CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((3, 0)) => Seq.empty
    case _ =>
      Seq(
        "org.scala-lang" % "scala-reflect" % scalaVersion.value
      )
  }
},
Interpolation
Blindsight has a statement interpolation feature that works in the same way as Scala's s"${foo}" interpolation. Although the Scala documentation doesn't mention it explicitly, string interpolation and StringContext are hooked into Scala's macro system, which means the macro implementation has to change for Scala 3.
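As a refresher, independent of Blindsight, an interpolated string is just sugar for a StringContext call:
val foo = 42

// s"value is ${foo}" desugars to (roughly):
StringContext("value is ", "").s(foo)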
Scala 3 macros are based on quotes and splices, along with a typed AST model that hides the untyped terms and trees. I struggled with this initially, and finally found the interpolation specs in Dotty that sorted out what was happening.
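As a minimal illustration of that model (my own sketch, not Blindsight's actual interpolator), a Scala 3 macro quotes the code it wants to generate and splices expressions into it:
import scala.quoted.*

// '{ ... } quotes typed code; ${ ... } splices an Expr into it.
inline def twice(inline n: Int): Int = ${ twiceImpl('n) }

def twiceImpl(n: Expr[Int])(using Quotes): Expr[Int] =
  '{ $n + $n }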
For reference, here's the Scala 2 interpolation compared to the Scala 3 interpolation.
I did make a very awkward discovery. In Scala 2, I could wrap a ToArgument around something and the compiler would happily find an appropriate implicit type class for it. That doesn't happen in Scala 3: implicits are only summoned for the precise type. In order to find something appropriate, I needed to explicitly widen the type, step by step, until implicit search found something:
def summonArgument[T: Type](expr: Expr[T])(using Quotes): Option[Expr[Argument]] = {
  import quotes.reflect.*
  // this is an Ident("arg") but we need to find the type
  // this will work when we ascribe the type as ${foo: Foo}
  // but implicit search doesn't widen the type to find its
  // companion object! So, we widen the type ourselves
  // until we find something.
  val option = Expr.summon[ToArgument[T]].map { toArg =>
    '{ $toArg.toArgument($expr) }
  }
  option.orElse {
    val originalType = TypeRepr.of[T]
    val widenedType = originalType.widen
    if (widenedType == originalType) {
      // report.error(s"Cannot find a ToArgument type class for ${originalType.show}!")
      None
    } else {
      widenedType.asType match {
        case '[t] =>
          val widenedExpr = '{ $expr.asInstanceOf[t] }
          summonArgument(widenedExpr)
      }
    }
  }
}
Inspection Macros
Then there's the biggie, inspection macros. I wrote a bit about these here, but the basic upshot is that you can rewrite the AST to do debug logging:
val fn = { (dval: ValDefInspection) =>
  logger.debug(s"${dval.name} = ${dval.value}")
}
decorateVals(fn) {
  val a = 5
  val b = 15
  a + b
}
or dump the public fields of an object:
class ExampleClass(val someInt: Int) {
  protected val protectedInt = 22
}
val exObj = new ExampleClass(42)
val publicFields = dumpPublicFields(exObj)
logger.debug(s"public fields = $publicFields") // will print ("someInt", 42)
I first did this by breaking it out into a different project, scala3-inspections, so that I wouldn't have to futz with cross-compilation. I leaned heavily on Macro Tips and Tricks and Enhancing DynamoDb client with Scala 3 for a bunch of it, and kept Quotes.scala bookmarked in my browser. Eventually, I got it all working together.
Here's the Scala 3 version compared against the Scala 2 version.
There are two modes, typed and untyped, and most of the fun in metaprogramming is determining how to flip between them, mostly using asTerm and asExprOf[T]. Everything untyped is a Tree, but most of the time you'll look at Term or Statement, because Tree is the base class of everything. You'll use pattern matching and unapply to pull data out of the tree.
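As a tiny illustration (my own sketch, not Blindsight code), here's a macro implementation that hops between the two layers:
import scala.quoted.*

// Sketch: crossing between the typed (Expr) and untyped (Term) layers.
def lengthImpl(s: Expr[String])(using Quotes): Expr[Int] = {
  import quotes.reflect.*
  val term: Term = s.asTerm                      // typed -> untyped
  val back: Expr[String] = term.asExprOf[String] // untyped -> typed
  '{ $back.length }
}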
Using inline
changes the code in unintuitive ways. For example, getting at the contents of a block was not working until a suggestion was made to change
inline def decorateVals[A](output: ValDefInspection => Unit)(block: A): A
to add an inline
to the block:
inline def decorateVals[A](output: ValDefInspection => Unit)(inline block: A): A
After that, I was able to insert statements pretty easily:
def decorateValsImpl[A: Type](output: Expr[ValDefInspection => Unit], block: Expr[A])(using Quotes): Expr[A] = {
  import quotes.reflect.*

  def rewriteBlock(data: Term): Term = {
    data match {
      case Block(stmts, expr) =>
        val newStmts = stmts.flatMap(rewriteStatement)
        Block(newStmts, expr)
    }
  }

  def rewriteStatement(statement: Statement): List[Statement] = {
    statement match {
      case valdef: ValDef =>
        val termExpr: Expr[String] = Expr(valdef.name)
        val termRef = TermRef(valdef.tpt.tpe, valdef.name)
        val identExpr = Ref(valdef.symbol).asExpr
        val inspection: Term = '{ $output(ValDefInspection($termExpr, $identExpr)) }.asTerm
        List(valdef, inspection)
      case other =>
        List(other)
    }
  }

  block.asTerm match {
    case Inlined(emptyTree, emptyList, statementBlock) =>
      val newBlock = rewriteBlock(statementBlock)
      Inlined(emptyTree, emptyList, newBlock).asExprOf[A]
    case _ =>
      block
  }
}
One thing that still confuses me is how Ref, Ident, and Select are related. For example, Ident(termRef) doesn't return an Ident. Instead, Ident.apply returns a Term, which can be either an Ident or a Select:
trait IdentModule { this: Ident.type =>
  def apply(tmref: TermRef): Term
  // ...
}
But a Ref takes a Symbol or a TermRef and returns a Ref:
trait RefModule { this: Ref.type =>
  def term(tp: TermRef): Ref
  def apply(sym: Symbol): Ref
}
It's not clear to me when I should use one over another, and how they're different. I did eventually get something working, but I'm pretty sure it was a combination of luck and copying.
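For illustration, here's a small sketch (my own, not Blindsight's code) of how Ident and Select show up when you pattern match a Term:
import scala.quoted.*

// Sketch: a bare name like foo appears as an Ident, while a path
// like obj.foo appears as a Select; both of them are Terms.
def describe[T: Type](expr: Expr[T])(using Quotes): Expr[String] = {
  import quotes.reflect.*
  expr.asTerm match {
    case id: Ident   => Expr(s"Ident(${id.name})")
    case sel: Select => Expr(s"Select(..., ${sel.name})")
    case other       => Expr(other.show)
  }
}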
On the plus side, Scala 3 macros are a huge improvement over quasiquotes because they can be checked at compile time. If I did anything wrong at all, the compiler would usually have a good error message for me.
On the AST side of things, there are some tools for traversing and transforming trees, the tree utilities. These come down to TreeAccumulator, TreeTraverser, and TreeMap, and you'll want to look at the source code for all three. Eugene wrote a post recently on sudori part 1 that discusses TreeMap. transformTerm is more useful than transformTree because you have a richer set of operations. I ended up not using TreeMap because it was "too much gun" for what I wanted to do, but it is very helpful to see a class that decomposes the AST into its component parts.
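To give a sense of the shape of these utilities, here's a sketch (not something Blindsight ships) of a TreeAccumulator that counts the val definitions in a block:
import scala.quoted.*

// Sketch: fold over a tree, counting ValDef nodes as we go.
def countValsImpl[A: Type](block: Expr[A])(using Quotes): Expr[Int] = {
  import quotes.reflect.*
  val accumulator = new TreeAccumulator[Int] {
    def foldTree(count: Int, tree: Tree)(owner: Symbol): Int = tree match {
      case _: ValDef => foldOverTree(count + 1, tree)(owner)
      case _         => foldOverTree(count, tree)(owner)
    }
  }
  Expr(accumulator.foldTree(0, block.asTerm)(Symbol.spliceOwner))
}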
Conclusion
All in all, I think Scala 3 is a solid improvement. If you're an end user, then I can see it being relatively painless. If you're a library owner, then the pain is in cross-compilation and CI, but it's about the same as going from 2.12 to 2.13. Macros are hard, because there's no way to migrate them incrementally and no scalafix tool that will rewrite your macros to be compatible. Still, working with quoting and splicing is much, much better than quasiquotes, and the AST just needs more documentation and examples.