Object-Oriented Meets Functional

Have the best of both worlds. Construct elegant class hierarchies for maximum code reuse and extensibility, and implement their behavior using higher-order functions. Or anything in between.



Scala began life in 2003, created by Martin Odersky and his research group at EPFL, next to Lake Geneva and the Alps, in Lausanne, Switzerland. Scala has since grown into a mature open source programming language, used by hundreds of thousands of developers, and is developed and maintained by scores of people all over the world.

Scala in a Nutshell


Seamless Java Interop

Scala runs on the JVM, so Java and Scala stacks can be freely mixed for totally seamless integration.

Type Inference

So the type system doesn’t feel so static. Don’t work for the type system. Let the type system work for you!

Concurrency & Distribution

Use data-parallel operations on collections, use actors for concurrency and distribution, or futures for asynchronous programming.


Traits

Combine the flexibility of Java-style interfaces with the power of classes. Think principled multiple-inheritance.

Pattern Matching

Think “switch” on steroids. Match against class hierarchies, sequences, and more.

Higher-Order Functions

Functions are first-class objects. Compose them with guaranteed type safety. Use them anywhere, pass them to anything.
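For instance, function composition with `andThen` is one line; the function names here are illustrative:

```scala
val double = (x: Int) => x * 2
val addOne = (x: Int) => x + 1

// Compose the two: the result type is checked by the compiler.
val doubleThenAddOne = double andThen addOne

doubleThenAddOne(5)  // 11
```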

class Author(val firstName: String,
    val lastName: String) extends Comparable[Author] {

  override def compareTo(that: Author) = {
    val lastNameComp = this.lastName compareTo that.lastName
    if (lastNameComp != 0) lastNameComp
    else this.firstName compareTo that.firstName
  }
}

object Author {
  def loadAuthorsFromFile(file: java.io.File): List[Author] = ???
}
import java.io.File;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

import static scala.collection.JavaConversions.asJavaCollection;

public class App {
    public List<Author> loadAuthorsFromFile(File file) {
        return new ArrayList<Author>(asJavaCollection(
            Author.loadAuthorsFromFile(file)));
    }

    public void sortAuthors(List<Author> authors) {
        Collections.sort(authors);
    }

    public void displaySortedAuthors(File file) {
        List<Author> authors = loadAuthorsFromFile(file);
        sortAuthors(authors);
        for (Author author : authors) {
            System.out.println(
                author.lastName() + ", " + author.firstName());
        }
    }
}

Combine Scala and Java seamlessly

Scala classes are ultimately JVM classes. You can create Java objects, call their methods and inherit from Java classes transparently from Scala. Similarly, Java code can reference Scala classes and objects.

In this example, the Scala class Author implements the Java interface Comparable<T> and works with Java Files. The Java code uses a method from the companion object Author, and accesses fields of the Author class. It also uses JavaConversions to convert between Scala collections and Java collections.

Type inference
scala> class Person(val name: String, val age: Int) {
     |   override def toString = s"$name ($age)"
     | }
defined class Person

scala> def underagePeopleNames(persons: List[Person]) = {
     |   for (person <- persons; if person.age < 18)
     |     yield person.name
     | }
underagePeopleNames: (persons: List[Person])List[String]

scala> import scala.util.Random
import scala.util.Random

scala> def createRandomPeople() = {
     |   val names = List("Alice", "Bob", "Carol",
     |       "Dave", "Eve", "Frank")
     |   for (name <- names) yield {
     |     val age = (Random.nextGaussian()*8 + 20).toInt
     |     new Person(name, age)
     |   }
     | }
createRandomPeople: ()List[Person]

scala> val people = createRandomPeople()
people: List[Person] = List(Alice (16), Bob (16), Carol (19), Dave (18), Eve (26), Frank (11))

scala> underagePeopleNames(people)
res1: List[String] = List(Alice, Bob, Frank)

Let the compiler figure out the types for you

The Scala compiler is smart about static types. Most of the time, you need not tell it the types of your variables. Instead, its powerful type inference will figure them out for you.

In this interactive REPL (Read-Eval-Print Loop) session, we define a class and two functions. You can observe that the compiler infers the result types of the functions automatically, as well as the types of all the intermediate values.

import scala.concurrent._
import ExecutionContext.Implicits.global

val x = future { someExpensiveComputation() }
val y = future { someOtherExpensiveComputation() }
val z = for (a <- x; b <- y) yield a*b
for (c <- z) println("Result: " + c)
println("Meanwhile, the main thread goes on!")

Go Concurrent or Distributed with Futures & Promises

In Scala, futures and promises can be used to process data asynchronously, making it easier to parallelize or even distribute your application.

In this example, the future{} construct evaluates its argument asynchronously, and returns a handle to the asynchronous result as a Future[Int]. For-comprehensions can be used to register new callbacks (to post new things to do) when the future is completed, i.e., when the computation is finished. And since all this is executed asynchronously, without blocking, the main program thread can continue doing other work in the meantime.
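Promises are the producer side of the same pair: a `Promise[T]` can be completed exactly once and hands out a read-only `Future[T]` to consumers. A minimal sketch (the `6 * 7` computation is just an illustrative stand-in):

```scala
import scala.concurrent._
import ExecutionContext.Implicits.global

// The promise is the writable side; its future is the read-only view.
val p = Promise[Int]()
val f: Future[Int] = p.future

// Some task, possibly on another thread, completes the promise exactly once.
future { p.success(6 * 7) }

// Callbacks registered on the future run once the promise has been completed.
f foreach { answer => println("The answer is " + answer) }
```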

abstract class Spacecraft {
  def engage(): Unit
}
trait CommandoBridge extends Spacecraft {
  def engage(): Unit = {
    for (_ <- 1 to 3)
      speedUp()
  }
  def speedUp(): Unit
}
trait PulseEngine extends Spacecraft {
  val maxPulse: Int
  var currentPulse: Int = 0
  def speedUp(): Unit = {
    if (currentPulse < maxPulse)
      currentPulse += 1
  }
}
class StarCruiser extends Spacecraft
                     with CommandoBridge
                     with PulseEngine {
  val maxPulse = 200
}

Flexibly Combine Interface & Behavior

In Scala, multiple traits can be mixed into a class to combine their interface and their behavior.

Here, a StarCruiser is a Spacecraft with a CommandoBridge that knows how to engage the ship (provided a means to speed up) and a PulseEngine that specifies how to speed up.

Pattern matching
// Define a set of case classes for representing binary trees.
sealed abstract class Tree
case class Node(elem: Int, left: Tree, right: Tree) extends Tree
case object Leaf extends Tree

// Return the in-order traversal sequence of a given tree.
def inOrder(t: Tree): List[Int] = t match {
  case Node(e, l, r) => inOrder(l) ::: List(e) ::: inOrder(r)
  case Leaf          => List()
}

Switch on the structure of your data

In Scala, case classes are used to represent structural data types. They implicitly equip the class with meaningful toString, equals and hashCode methods, as well as the ability to be deconstructed with pattern matching.

In this example, we define a small set of case classes that represent binary trees of integers (the generic version is omitted for simplicity here). In inOrder, the match construct chooses the right branch, depending on the type of t, and at the same time deconstructs the arguments of a Node.
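Using the definitions above (repeated here so the snippet stands alone), a small illustrative tree shows both the traversal and the structural equality and printing that case classes provide for free:

```scala
sealed abstract class Tree
case class Node(elem: Int, left: Tree, right: Tree) extends Tree
case object Leaf extends Tree

def inOrder(t: Tree): List[Int] = t match {
  case Node(e, l, r) => inOrder(l) ::: List(e) ::: inOrder(r)
  case Leaf          => List()
}

// A tree with 2 at the root, 1 on the left and 3 on the right:
val tree = Node(2, Node(1, Leaf, Leaf), Node(3, Leaf, Leaf))

inOrder(tree)                               // List(1, 2, 3)
Node(1, Leaf, Leaf) == Node(1, Leaf, Leaf)  // true: structural equality
Node(1, Leaf, Leaf).toString                // "Node(1,Leaf,Leaf)"
```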

Go Functional with Higher-Order Functions

In Scala, functions are values, and can be defined as anonymous functions with a concise syntax.

val people: Array[Person]

// Partition `people` into two arrays `minors` and `adults`,
// using the higher-order method `partition` with the predicate `(_.age < 18)`.
val (minors, adults) = people partition (_.age < 18)
List<Person> people;

List<Person> minors = new ArrayList<Person>(people.size());
List<Person> adults = new ArrayList<Person>(people.size());
for (Person person : people) {
    if (person.getAge() < 18)
        minors.add(person);
    else
        adults.add(person);
}

In the Scala example on the left, the partition method, available on all collection types (including Array), returns two new collections of the same type. Elements from the original collection are partitioned according to a predicate, which is given as a lambda, i.e., an anonymous function. The _ stands for the parameter to the lambda, i.e., the element that is being tested. This particular lambda can also be written as (x => x.age < 18).

The same program is implemented in Java on the right.


What's New

Monday, July 28, 2014

As with every living programming language, Scala will continue to evolve. This document describes where the core Scala team sees the language going in the medium term and where we plan to invest our efforts.

In a nutshell, our main goals are to make the language and its libraries simpler to understand, more robust, and better performing. The features described in this document span the next three major releases of the Scala distribution. Naturally, the planning for later releases is more tentative and fluid than for earlier ones.

Scala 2.12

Scala 2.12’s main theme is Java 8 interoperability. It will support Java 8 lambdas and streams and will allow easy cross calls with these features in both directions. We recently published a detailed feature list and roadmap for this release.

We have not yet decided on version numbers for the releases beyond 2.12, so for the time being we will use opera names as designators.

Scala “Aida”

This release focuses on improving the standard library.

  1. Cleanups and simplification of the collections library: we plan to reduce the size of the collections library, providing some functionality as separate modules. Generally, we want to make them even easier to use and structure them so that they are more amenable to optimizations. Where needed, breaking changes will be announced using deprecation in Scala 2.12; regular use of the collections will likely be unaffected, but custom collections may need to be adapted to the simplified hierarchy.

    1. Reduce reliance on inheritance
    2. Make all default collections immutable (e.g. scala.Seq will be an alias of scala.immutable.Seq)
    3. Other small cleanups that are possible with a rewriting step (e.g. rename mapValues)
  2. Added functionality: We’d like to introduce several new modules, including a couple of spin-offs from the collections library.

    1. Lazy collections through improved views, including Java 8 streams interop.
    2. Parallel collections with performance improvements obtained from operation fusion and more efficient parallel scheduling.
    3. An integrated abstraction to handle validation.
  3. The (independent) scala.meta project aims to establish a new standard for reflection and macro programming. It will be considered for integration in the standard library once it is mature and stable.

  4. As in every Scala release, we’ll also work on improving compiler performance. Since this release focuses on the library, compiler changes will be strictly internal.

Backwards compatibility and migration strategy: The changes to collections might require source code to be rewritten, even though this should be rare. However, we aim to maintain source code compatibility modulo an automatic migration tool (analogous to go fix for Go) that can do the rewriting automatically. Ideally, that tool should be robust and expressive enough to support cross-building.

Prototypes of the new collection functionality and meta-programming libraries will be made available as separate libraries in the Scala 2.12 timeframe, so that projects can experiment with the new features early.

Scala “Don Giovanni”

The main focus for this release is the Scala programming language and its compiler. The new version should provide clear improvements in simplicity, usability and stability, while at the same time staying backwards compatible with current usage of the language.

Areas that will be investigated include the following:

  1. Cleaned-up syntax: The objective is to more clearly expose Scala’s principle of having few orthogonally composable features.

    1. Trait parameters instead of early definition syntax
    2. XML string interpolation instead of XML literals
    3. Procedure syntax is dropped in favor of always defining functions with =
    4. Simplified and unified type syntax for all forms of information elision: existential types and partial type applications are both expressed with _, forSome syntax is eliminated.
  2. Removing puzzlers: There are some features in Scala which are known to be prone to puzzlers, and which can be made safer by tweaking the language. In particular, the following changes would help:

    1. Result types are mandatory for implicit definitions.
    2. Inherited explicit result types take precedence over locally-inferred ones.
    3. Universal toString conversion and concatenation via + should require explicit enabling.
    4. Avoid surprising behavior of auto-tupling.
  3. Simple foundations: This continues the drive for simplicity on the type-system side. We will identify formerly disparate features as specific instances of a small set of concepts. This will help in understanding the individual features and how they hang together. It will also reduce unwanted feature interactions. In particular:

    1. A single fundamental concept - type members - can give a precise meaning to generics, existential types, wildcards, and higher-kinded types.
    2. Intersection and union types make member selection more regular and avoid blow-ups when computing tight upper and lower bounds of sets of types.
    3. Tuples can be decomposed recursively, overcoming current limits to tuple size, and leading to simpler, streamlined native support for abstractions like HLists or HMaps which are currently implemented in some form or other in various libraries.
    4. The type system will have theoretical foundations that are given by a minimal core calculus (DOT).
  4. Better tooling: We will continue to focus on the tooling side, with the goals to improve batch compiler speed and to make the compiler more adapted to fast incremental compilation and IDE presentation support.

  5. Faster code: We plan to improve performance of generated code using optimizations including:

    1. Robust specialization using Miniboxing techniques, applied to collections (a preview of this may already be available in Aida).
    2. Improvements to value classes: Can be array elements, can play part in specializations, can be multi-field.
    3. Optimized implementation of thread-local lazy vals.

We will collaborate here with the Java effort in project Valhalla, which has similar goals.

Backwards compatibility

Since some features are superseded by others, some source code will have to be rewritten. However, using the migration tool described earlier, common Scala code should port automatically. In particular, we aim to ensure that all features described in the latest edition of “Programming in Scala” can be ported automatically. The porting guarantee will not extend, however, to features that are labelled “experimental”. For some of these (e.g. macros and reflection), we aim to have a replacement that can fulfill analogous functionality, but using different notation and APIs.


Currently, having a feature on the list does not mean that we have already committed the resources to work on it. The roadmap is intended as a framework for the development of future Scala versions. We are happy to take contributions that implement parts of it that are lower on our priority list. A feature not listed here must first be accepted for inclusion in the roadmap before work on it begins.


Thursday, July 24, 2014 (announcement)
We are very pleased to announce the release of Scala 2.11.2! Get started with the Hello Scala 2.11 template in Typesafe Activator. Download a distribution...
Monday, June 30, 2014 (announcement)
Scala 2.12 will require Java 8. Here’s how we plan to make this transition as smooth as possible. Goals: Minimize overhead of the transition for...
Wednesday, May 21, 2014 (announcement)
We are very pleased to announce the release of Scala 2.11.1! Get started with the Hello Scala 2.11 template in Typesafe Activator. Download a distribution...
For more, visit our News archive or Blog.
