Object-Oriented Meets Functional

Have the best of both worlds. Construct elegant class hierarchies for maximum code reuse and extensibility, and implement their behavior using higher-order functions. Or anything in-between.

Scala began life in 2003, created by Martin Odersky and his research group at EPFL, next to Lake Geneva and the Alps, in Lausanne, Switzerland. Scala has since grown into a mature open source programming language, used by hundreds of thousands of developers, and is developed and maintained by scores of people all over the world.

Scala in a Nutshell


Seamless Java Interop

Scala runs on the JVM, so Java and Scala stacks can be freely mixed for totally seamless integration.

Type Inference

So the type system doesn’t feel so static. Don’t work for the type system. Let the type system work for you!

Concurrency & Distribution

Use data-parallel operations on collections, use actors for concurrency and distribution, or futures for asynchronous programming.


Traits

Combine the flexibility of Java-style interfaces with the power of classes. Think principled multiple-inheritance.

Pattern Matching

Think “switch” on steroids. Match against class hierarchies, sequences, and more.

Higher-Order Functions

Functions are first-class objects. Compose them with guaranteed type safety. Use them anywhere, pass them to anything.
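As a small illustration of composing first-class functions (the names here are illustrative, not from the original page):

```scala
object ComposeDemo {
  val double: Int => Int = _ * 2
  val inc: Int => Int = _ + 1

  // `andThen` composes two functions; the compiler statically checks
  // that the output type of `double` matches the input type of `inc`.
  val doubleThenInc: Int => Int = double andThen inc

  def main(args: Array[String]): Unit =
    println(doubleThenInc(5)) // 11
}
```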

class Author(val firstName: String,
    val lastName: String) extends Comparable[Author] {

  override def compareTo(that: Author) = {
    val lastNameComp = this.lastName compareTo that.lastName
    if (lastNameComp != 0) lastNameComp
    else this.firstName compareTo that.firstName
  }
}

object Author {
  def loadAuthorsFromFile(file: java.io.File): List[Author] = ???
}
import static scala.collection.JavaConversions.asJavaCollection;

public class App {
    public List<Author> loadAuthorsFromFile(File file) {
        return new ArrayList<Author>(asJavaCollection(
            Author.loadAuthorsFromFile(file)));
    }

    public void sortAuthors(List<Author> authors) {
        Collections.sort(authors);
    }

    public void displaySortedAuthors(File file) {
        List<Author> authors = loadAuthorsFromFile(file);
        sortAuthors(authors);
        for (Author author : authors) {
            System.out.println(
                author.lastName() + ", " + author.firstName());
        }
    }
}

Combine Scala and Java seamlessly

Scala classes are ultimately JVM classes. You can create Java objects, call their methods and inherit from Java classes transparently from Scala. Similarly, Java code can reference Scala classes and objects.

In this example, the Scala class Author implements the Java interface Comparable<T> and works with Java Files. The Java code uses a method from the companion object Author, and accesses fields of the Author class. It also uses JavaConversions to convert between Scala collections and Java collections.

Type inference
scala> class Person(val name: String, val age: Int) {
     |   override def toString = s"$name ($age)"
     | }
defined class Person

scala> def underagePeopleNames(persons: List[Person]) = {
     |   for (person <- persons; if person.age < 18)
     |     yield person.name
     | }
underagePeopleNames: (persons: List[Person])List[String]

scala> import scala.util.Random
import scala.util.Random

scala> def createRandomPeople() = {
     |   val names = List("Alice", "Bob", "Carol",
     |       "Dave", "Eve", "Frank")
     |   for (name <- names) yield {
     |     val age = (Random.nextGaussian()*8 + 20).toInt
     |     new Person(name, age)
     |   }
     | }
createRandomPeople: ()List[Person]

scala> val people = createRandomPeople()
people: List[Person] = List(Alice (16), Bob (16), Carol (19), Dave (18), Eve (26), Frank (11))

scala> underagePeopleNames(people)
res1: List[String] = List(Alice, Bob, Frank)

Let the compiler figure out the types for you

The Scala compiler is smart about static types. Most of the time, you need not tell it the types of your variables. Instead, its powerful type inference will figure them out for you.

In this interactive REPL session (Read-Eval-Print-Loop), we define a class and two functions. You can observe that the compiler infers the result types of the functions automatically, as well as all the intermediate values.

import scala.concurrent._
import ExecutionContext.Implicits.global

val x = future { someExpensiveComputation() }
val y = future { someOtherExpensiveComputation() }
val z = for (a <- x; b <- y) yield a*b
for (c <- z) println("Result: " + c)
println("Meanwhile, the main thread goes on!")

Go Concurrent or Distributed with Futures & Promises

In Scala, futures and promises can be used to process data asynchronously, making it easier to parallelize or even distribute your application.

In this example, the future{} construct evaluates its argument asynchronously, and returns a handle to the asynchronous result as a Future[Int]. For-comprehensions can be used to register new callbacks (to post new things to do) when the future is completed, i.e., when the computation is finished. And since all this is executed asynchronously, without blocking, the main program thread can continue doing other work in the meantime.
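The snippet above leaves someExpensiveComputation undefined; a self-contained, runnable sketch with placeholder computations (the values and object name are illustrative) might look like this:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object FuturesDemo {
  def compute(): Int = {
    // Both bodies start evaluating asynchronously on the global thread pool.
    val x = Future { 6 * 7 }
    val y = Future { 100 / 4 }

    // The for-comprehension registers a callback; nothing blocks here.
    val z = for (a <- x; b <- y) yield a * b

    println("Meanwhile, the main thread goes on!")

    // Block only at the very end, so the demo prints deterministically.
    Await.result(z, 5.seconds)
  }

  def main(args: Array[String]): Unit =
    println("Result: " + compute()) // Result: 1050
}
```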

abstract class Spacecraft {
  def engage(): Unit
}
trait CommandoBridge extends Spacecraft {
  def engage(): Unit = {
    for (_ <- 1 to 3)
      speedUp()
  }
  def speedUp(): Unit
}
trait PulseEngine extends Spacecraft {
  val maxPulse: Int
  var currentPulse: Int = 0
  def speedUp(): Unit = {
    if (currentPulse < maxPulse)
      currentPulse += 1
  }
}
class StarCruiser extends Spacecraft
                     with CommandoBridge
                     with PulseEngine {
  val maxPulse = 200
}

Flexibly Combine Interface & Behavior

In Scala, multiple traits can be mixed into a class to combine their interface and their behavior.

Here, a StarCruiser is a Spacecraft with a CommandoBridge that knows how to engage the ship (provided a means to speed up) and a PulseEngine that specifies how to speed up.
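To see the mixin in action, one can instantiate the cruiser and engage it. This compact, self-contained sketch restates the same hierarchy so it compiles on its own:

```scala
object MixinDemo {
  abstract class Spacecraft { def engage(): Unit }
  trait CommandoBridge extends Spacecraft {
    def engage(): Unit = for (_ <- 1 to 3) speedUp()
    def speedUp(): Unit
  }
  trait PulseEngine extends Spacecraft {
    val maxPulse: Int
    var currentPulse: Int = 0
    def speedUp(): Unit = if (currentPulse < maxPulse) currentPulse += 1
  }
  class StarCruiser extends Spacecraft with CommandoBridge with PulseEngine {
    val maxPulse = 200
  }

  def main(args: Array[String]): Unit = {
    val ship = new StarCruiser
    ship.engage()              // CommandoBridge.engage calls speedUp() three times
    println(ship.currentPulse) // 3
  }
}
```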

Pattern matching
// Define a set of case classes for representing binary trees.
sealed abstract class Tree
case class Node(elem: Int, left: Tree, right: Tree) extends Tree
case object Leaf extends Tree

// Return the in-order traversal sequence of a given tree.
def inOrder(t: Tree): List[Int] = t match {
  case Node(e, l, r) => inOrder(l) ::: List(e) ::: inOrder(r)
  case Leaf          => List()
}

Switch on the structure of your data

In Scala, case classes are used to represent structural data types. They implicitly equip the class with meaningful toString, equals and hashCode methods, as well as the ability to be deconstructed with pattern matching.

In this example, we define a small set of case classes that represent binary trees of integers (the generic version is omitted for simplicity here). In inOrder, the match construct chooses the right branch, depending on the type of t, and at the same time deconstructs the arguments of a Node.
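A short, self-contained usage sketch of the same definitions shows all three benefits at once: the generated toString, structural equality, and pattern-matching deconstruction in inOrder:

```scala
object TreeDemo {
  sealed abstract class Tree
  case class Node(elem: Int, left: Tree, right: Tree) extends Tree
  case object Leaf extends Tree

  def inOrder(t: Tree): List[Int] = t match {
    case Node(e, l, r) => inOrder(l) ::: List(e) ::: inOrder(r)
    case Leaf          => List()
  }

  //      2
  //     / \
  //    1   3
  val t = Node(2, Node(1, Leaf, Leaf), Node(3, Leaf, Leaf))

  def main(args: Array[String]): Unit = {
    println(inOrder(t)) // List(1, 2, 3)
    println(t)          // generated toString: Node(2,Node(1,Leaf,Leaf),Node(3,Leaf,Leaf))
    // Structural equality comes for free with case classes.
    println(t == Node(2, Node(1, Leaf, Leaf), Node(3, Leaf, Leaf))) // true
  }
}
```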

Go Functional with Higher-Order Functions

In Scala, functions are values, and can be defined as anonymous functions with a concise syntax.

val people: Array[Person]

// Partition `people` into two arrays, `minors` and `adults`, by passing
// the anonymous function `(_.age < 18)` as a predicate to `partition`.
val (minors, adults) = people partition (_.age < 18)
List<Person> people;

List<Person> minors = new ArrayList<Person>(people.size());
List<Person> adults = new ArrayList<Person>(people.size());
for (Person person : people) {
    if (person.getAge() < 18)
        minors.add(person);
    else
        adults.add(person);
}

In the Scala example on the left, the partition method, available on all collection types (including Array), returns two new collections of the same type. Elements from the original collection are partitioned according to a predicate, which is given as a lambda, i.e., an anonymous function. The _ stands for the parameter to the lambda, i.e., the element that is being tested. This particular lambda can also be written as (x => x.age < 18).

The same program is implemented in Java on the right.
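A runnable, self-contained variant of the Scala side (with a hypothetical Person case class and sample data) makes the behavior of partition concrete:

```scala
object PartitionDemo {
  case class Person(name: String, age: Int)

  val people = Array(
    Person("Alice", 16),
    Person("Bob", 21),
    Person("Carol", 17))

  // `partition` splits the array into elements satisfying the predicate
  // and elements that do not, preserving order within each group.
  val (minors, adults) = people partition (_.age < 18)

  def main(args: Array[String]): Unit = {
    println(minors.map(_.name).mkString(", ")) // Alice, Carol
    println(adults.map(_.name).mkString(", ")) // Bob
  }
}
```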


What's New

Monday, October 24, 2016

I am excited to announce the first release of scalafix, a new tool to prepare Scala 2.x code for Dotty, a next-generation Scala compiler. This effort follows the Scala Center Advisory Board proposal: “Clarification of Scala to Dotty migration path”.

There is a lot to be excited about in Dotty: faster compilation times, faster and smaller binaries, and awesome error messages, to name a few things. However, some Scala 2.x applications can’t immediately enjoy these benefits due to several breaking changes in Dotty. Scalafix is part of our strategy to smooth the migration process for those applications.

Scalafix takes care of easy, repetitive and tedious code transformations so you can focus on the changes that truly deserve your attention. In a nutshell, scalafix reads a source file, transforms usage of unsupported features into newer alternatives, and writes the final result back to the original source file. Scalafix aims to automate the migration of as many unsupported features as possible. There will be cases where scalafix is unable to perform automatic migration. In such situations, scalafix will point to the offending source line and provide instructions on how to refactor the code. The objective is that your Scala 2.x codebase gets up and running faster with Dotty.

What can scalafix do?

Scalafix comes with a command line interface and an SBT plugin. Running scalafix is as simple as:

# command line
scalafix Code.scala
# SBT, after adding plugin to project/plugins.sbt
sbt scalafix

More detailed instructions on how to setup scalafix are on the project website.

This initial release implements two rewrites: ProcedureSyntax and VolatileLazyVal. More rewrite rules are planned for future releases.

The ProcedureSyntax rewrite works like this:

// before
def main(args: Seq[String]) { // note lack of '='
  println("Hello scalafix!")
}
// after
def main(args: Seq[String]): Unit = { // note ': Unit ='
  println("Hello scalafix!")
}

Dropping procedure syntax is part of a general effort in Dotty to simplify the Scala language.

The VolatileLazyVal rewrite adds a @volatile annotation to lazy vals, like this:

lazy val x = ... // before
@volatile lazy val x = ... // after

With @volatile, Dotty uses a deadlock-free scheme that is comparable to, if not faster than, the scheme used in scalac.

How does scalafix work?

All the heavy-lifting behind scalafix happens in scala.meta, a new metaprogramming toolkit aspiring to succeed scala.reflect. Scala.meta makes it easy to do non-trivial code transformations while preserving syntactic details in the original source file. This key attribute of scala.meta makes it suitable for both developer tools like scalafix as well as compile-time metaprograms like macros.

The grand vision of scala.meta is to provide a single interface to accomplish most common metaprogramming tasks in Scala. The idea is to untie metaprograms from compiler implementations, providing a platform-independent API that works in the Scala compiler, Dotty, and tools like IntelliJ IDEA. Metaprogram authors benefit from a user-friendly interface built on immutable ADTs and type classes. Compiler authors benefit from the fact that they don’t need to expose compiler internals in order to support popular features like macros.

In scalafix, we use the scala.meta API to parse Scala source files, traverse abstract syntax trees, inspect tokens (including comments!) and then perform the minimum required transformations to the source file. Thanks to scala.meta dialects, which abstract away syntactic differences between variants of Scala, we can use the same programs to manipulate regular Scala source files and SBT files, as well as code that uses new Dotty syntax such as union types and trait parameters. All this powerful functionality is implemented behind a minimalistic interface. The VolatileLazyVal implementation is 15 lines of code. It could comfortably fit in three tweets.

We would not be able to achieve this level of expressiveness with scala.reflect, the current standard metaprogramming framework for Scala. Why? To name a few reasons, scala.reflect does not preserve syntactic details such as comments and scala.reflect has no notions of dialects, so it is bound to only support Scala 2.x.

How does scalafix evolve?

As we have seen above, the functionality of scalafix heavily relies on the infrastructure provided by scala.meta. However, in order to write more sophisticated scalafix rewrite rules, there are two main features missing in scala.meta, namely a scalac converter and semantic API. We are closely collaborating with Eugene Burmako, the lead developer of scala.meta, in the development efforts of these two features.

Scalac converter

The scalac converter is a key subsystem of scala.meta. For every Scala feature, the converter recognizes its representation in the Scala compiler and translates it into the corresponding data structures of the scala.meta API.

Over the last months, we’ve made great improvements to the converter, which is under development in the scalameta/paradise repository. We have established a comprehensive test suite that consists of a sample of over 26,000 Scala source files collected from popular open-source projects. Thanks to the joint effort of the team of scala.meta contributors, the converter now supports about 19,000 source files from the test corpus. We are continuously working to increase that number, aiming to bring language coverage to 100%.

Semantic API

The semantic API will enable scala.meta programs to inspect compiler information such as inferred types, resolved names, and other data that requires a compilation step. This opens opportunities for new scalafix rewrite rules that cannot be implemented on a purely syntactic level, unlike VolatileLazyVal and ProcedureSyntax.

We already have a working prototype of the first scalafix rewrite that uses the semantic API: ExplicitImplicit. ExplicitImplicit inserts type annotations for implicit definitions, like this:

// before
implicit val x = (_: String).length
// after
implicit val x: String => Int = (_: String).length

ExplicitImplicit requires the semantic API in order to get the inferred type annotations.

Towards new-style macros

One could say that by contributing to scala.meta, we get two features for the price of one. First, an improved converter and semantic API enables us to implement more sophisticated scalafix rewrite rules. Secondly, we accelerate the development of a new macro system that will work both with scalac and Dotty.

As announced at Scala Days 2016, the current macros based on scala.reflect will be going away. Such macros have a number of known issues, including an overly complicated API and poor IDE support. More importantly, the deep coupling between old-style macros and scalac means that macros written in the old system effectively cannot be ported to Dotty. The plan is to discontinue support for scala.reflect macros in favor of a new macro system called inline/meta.

A technology preview of the new-style macros came out this summer. This early release is still somewhat rough around the edges. Nevertheless, scalafmt has already been using these new macro annotations in production for more than a month. If you are interested in learning more, the best place to get started is my recent Scala World workshop.

Much like scalafix, new-style macros crucially rely on the infrastructure provided by scala.meta. Concretely, the converter and the semantic APIs constitute a significant chunk of the implementation effort behind the new macro system.

Interestingly, our contributions to scala.meta were originally motivated by the needs of scalafix. However, they are also immediately helping the development of this new universal macro system for Scala. We are excited that our open-source collaboration with the scala.meta team brings multiple benefits for the good of everyone!

Get involved

Are you interested in metaprogramming, developer tools and running experiments on millions of lines of Scala code? Come chat with us on Gitter, and we’ll discuss how you can make a difference in shaping the developer tools of the future!

