sbt Example: Lagom Java

These instructions show how to add telemetry to a sample Lagom application. You will add Cinnamon together with a Coda Hale console reporter that prints telemetry output to the terminal window.

Prerequisites

The following must be installed for these instructions to work:

  • Java
  • sbt
  • Lightbend commercial credentials

Lightbend commercial credentials

Follow these instructions to set up your Lightbend commercial credentials for sbt.
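The commercial.credentials file referenced from project/plugins.sbt is a plain sbt credentials file. A minimal sketch is shown below — the realm value is an assumption, and user and password are placeholders you replace with the values from your Lightbend account; the host must match the resolver defined in project/plugins.sbt:

```
realm = lightbend-commercial
host = repo.lightbend.com
user = <your-user>
password = <your-password>
```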

Sample application

We will use a stripped-down version of the Hello Service from the Java with sbt version of the Lagom getting started guide. To demonstrate circuit breaker metrics, we will also create a second service call that makes an outgoing call to the Hello Service.

Start off by creating a folder lagom-java-example with the following files (content of files will be added later):

  • build.sbt
  • project/build.properties
  • project/plugins.sbt
  • hello-service-api/src/main/java/cinnamon/lagom/api/HelloService.java
  • hello-service-impl/src/main/resources/application.conf
  • hello-service-impl/src/main/java/cinnamon/lagom/impl/HelloServiceImpl.java
  • hello-service-impl/src/main/java/cinnamon/lagom/impl/HelloModule.java

The next step is to add content to each of the files created.
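The layout above can be created from a terminal before filling in the files. These commands are illustrative (they use the hello-service-api folder name that build.sbt expects):

```shell
# Create the project folder and the source directories
mkdir -p lagom-java-example && cd lagom-java-example
mkdir -p project \
  hello-service-api/src/main/java/cinnamon/lagom/api \
  hello-service-impl/src/main/resources \
  hello-service-impl/src/main/java/cinnamon/lagom/impl

# Create the empty files to be filled in below
touch build.sbt project/build.properties project/plugins.sbt \
  hello-service-api/src/main/java/cinnamon/lagom/api/HelloService.java \
  hello-service-impl/src/main/resources/application.conf \
  hello-service-impl/src/main/java/cinnamon/lagom/impl/HelloServiceImpl.java \
  hello-service-impl/src/main/java/cinnamon/lagom/impl/HelloModule.java
```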

Add to build.sbt:

organization in ThisBuild := "cinnamon.lagom.example"
version in ThisBuild := "0.1-SNAPSHOT"

// The Scala version that will be used for cross-compiled libraries
scalaVersion in ThisBuild := "2.12.4"

// This sample is not using Cassandra or Kafka
lagomCassandraEnabled in ThisBuild := false
lagomKafkaEnabled in ThisBuild := false

lazy val helloServiceApi = project
  .in(file("hello-service-api"))
  .settings(common: _*)
  .settings(
    libraryDependencies ++= Seq(
      lagomJavadslApi
    )
  )

lazy val helloServiceImpl = project
  .in(file("hello-service-impl"))
  .enablePlugins(LagomJava, Cinnamon)
  .settings(common: _*)
  .settings(
    // Enable Cinnamon during tests
    cinnamon in test := true,
    // Add a Play secret to javaOptions in (Test, run), so we can run Lagom forked
    javaOptions in (Test, run) += "-Dplay.http.secret.key=x",
    libraryDependencies ++= Seq(
      // Use Coda Hale Metrics and Lagom instrumentation
      Cinnamon.library.cinnamonCHMetrics,
      Cinnamon.library.cinnamonLagom
    )
  )
  .dependsOn(helloServiceApi)

lazy val common = Seq(
  javacOptions in compile ++= Seq("-encoding", "UTF-8", "-source", "1.8", "-target", "1.8", "-Xlint:unchecked", "-Xlint:deprecation"),
  // See https://github.com/FasterXML/jackson-module-parameter-names
  javacOptions in compile += "-parameters"
)

Add to project/build.properties:

sbt.version=1.1.5

Add to project/plugins.sbt:

// The Lagom plugin
addSbtPlugin("com.lightbend.lagom" % "lagom-sbt-plugin" % "1.4.7")
// Needed for importing the project into Eclipse
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "5.2.4")
// The Cinnamon Telemetry plugin
addSbtPlugin("com.lightbend.cinnamon" % "sbt-cinnamon" % "2.10.0")
// Credentials and resolver to download the Cinnamon Telemetry libraries
credentials += Credentials(Path.userHome / ".lightbend" / "commercial.credentials")
resolvers += Resolver.url("lightbend-commercial", url("https://repo.lightbend.com/commercial-releases"))(Resolver.ivyStylePatterns)

Make sure that the credentials setting points to where you chose to create the commercial credentials file.

Add to hello-service-api/src/main/java/cinnamon/lagom/api/HelloService.java:

package cinnamon.lagom.api;

import akka.NotUsed;
import com.lightbend.lagom.javadsl.api.*;
import static com.lightbend.lagom.javadsl.api.Service.*;

public interface HelloService extends Service {
    /**
     * Example: curl http://localhost:9000/api/hello/Alice
     */
    ServiceCall<NotUsed, String> hello(String id);

    /**
     * Example: curl http://localhost:9000/api/hello-proxy/Alice
     */
    ServiceCall<NotUsed, String> helloProxy(String id);

    @Override
    default Descriptor descriptor() {
        return named("hello").withCalls(
            pathCall("/api/hello/:id", this::hello),
            pathCall("/api/hello-proxy/:id", this::helloProxy)
        ).withAutoAcl(true);
    }
}

Add to hello-service-impl/src/main/java/cinnamon/lagom/impl/HelloServiceImpl.java:

package cinnamon.lagom.impl;

import akka.NotUsed;
import cinnamon.lagom.api.HelloService;
import com.lightbend.lagom.javadsl.api.ServiceCall;
import java.util.concurrent.CompletionStage;
import javax.inject.Inject;
import static java.util.concurrent.CompletableFuture.completedFuture;

public class HelloServiceImpl implements HelloService {

    private final HelloService helloService;

    @Inject
    public HelloServiceImpl(HelloService helloService) {
        this.helloService = helloService;
    }

    @Override
    public ServiceCall<NotUsed, String> hello(String id) {
        return request -> {
            return completedFuture("Hello " + id);
        };
    }

    @Override
    public ServiceCall<NotUsed, String> helloProxy(String id) {
        return msg -> {
            CompletionStage<String> response = helloService.hello(id).invoke(NotUsed.getInstance());
            return response.thenApply(answer -> "Hello service said: " + answer);
        };
    }

}

Note: Since circuit breakers are used for outgoing connections in Lagom, we have a service call, named hello-proxy, that does an outgoing call to the first service call, named hello.
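The proxy composition above is plain CompletionStage chaining. As a minimal sketch without any Lagom dependencies (class and method names here are illustrative stand-ins for the injected service client):

```java
import java.util.concurrent.CompletionStage;
import static java.util.concurrent.CompletableFuture.completedFuture;

public class HelloProxySketch {

    // Stand-in for the injected HelloService client's hello call
    static CompletionStage<String> hello(String id) {
        return completedFuture("Hello " + id);
    }

    // Stand-in for helloProxy: invoke the downstream call, then transform its answer
    static CompletionStage<String> helloProxy(String id) {
        return hello(id).thenApply(answer -> "Hello service said: " + answer);
    }

    public static void main(String[] args) throws Exception {
        // Prints: Hello service said: Hello Alice
        System.out.println(helloProxy("Alice").toCompletableFuture().get());
    }
}
```

In the real service, the outgoing hello(id).invoke(...) call is what passes through the circuit breaker.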

Now you need to register the service in a module. Note that we also register the ConfigurationServiceLocator, so that this sample can run standalone from inside sbt.

Add to hello-service-impl/src/main/java/cinnamon/lagom/impl/HelloModule.java:

package cinnamon.lagom.impl;

import cinnamon.lagom.api.HelloService;
import com.google.inject.AbstractModule;
import com.lightbend.lagom.javadsl.api.ServiceLocator;
import com.lightbend.lagom.javadsl.client.ConfigurationServiceLocator;
import com.lightbend.lagom.javadsl.server.ServiceGuiceSupport;
import com.typesafe.config.Config;
import play.Environment;

public class HelloModule extends AbstractModule implements ServiceGuiceSupport {

    private final Environment environment;
    private final Config config;

    public HelloModule(Environment environment, Config config) {
        this.environment = environment;
        this.config = config;
    }

    @Override
    protected void configure() {
        bindService(HelloService.class, HelloServiceImpl.class);

        // Only needed to allow the sample to run from test and dist with the config Service Locator
        if (environment.isProd()) {
            bind(ServiceLocator.class).to(ConfigurationServiceLocator.class);
        }
    }
}

Finally, add to hello-service-impl/src/main/resources/application.conf:

play.modules.enabled += cinnamon.lagom.impl.HelloModule

lagom.circuit-breaker.default.max-failures = 10

// Only needed to allow the sample to run from test and dist with the config Service Locator
lagom.services {
  hello  = "http://localhost:9000"
}

lagom.spi.circuit-breaker-metrics-class = "cinnamon.lagom.CircuitBreakerInstrumentation"

cinnamon {

  application = "hello-lagom"

  chmetrics.reporters += "console-reporter"

  akka.actors {
    default-by-class {
      includes = "/user/*"
      report-by = class
      excludes = ["akka.http.*", "akka.stream.*"]
    }
  }

  lagom.http {
    servers {
      "*:*" {
        paths {
          "*" {
            metrics = on
          }
        }
      }
    }
    clients {
      "*:*" {
        paths {
          "*" {
            metrics = on
          }
        }
      }
    }
  }
}

The application.conf file is where the actual wiring of telemetry takes place, so let us dissect it further:

  • lagom.spi.circuit-breaker-metrics-class — The class that handles all circuit breaker related metrics. Lightbend Telemetry provides an implementation, but you can also use your own if you prefer. For more information see the Lagom SPI implementation.
  • cinnamon.chmetrics.reporters — Specifies the Coda Hale reporter you wish to use. For more information see Coda Hale. Note that there are other ways to send data, e.g. StatsD or OverOps.
  • cinnamon.akka.actors — Specifies which actors to collect metrics for. For more information see Actor Configuration.
  • cinnamon.lagom.http — Specifies which HTTP servers and clients to collect metrics for. For more information see Lagom Configuration.

The lagom.services section tells the ConfigurationServiceLocator where to find the hello service for outgoing service calls; it is only needed because we run this sample standalone from inside sbt.
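If the application grew to several services, each one would get its own entry under lagom.services. A sketch with a second, hypothetical service (name and port are illustrative):

```
lagom.services {
  hello    = "http://localhost:9000"
  # A second, hypothetical service resolved the same way
  greeting = "http://localhost:9001"
}
```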

Note: A Lagom application normally consists of multiple projects, one for each microservice, and you need to make sure that there is an application.conf file for each project that you would like to instrument.

Running

Once you have added the files above, start the application. To run it forked, run the underlying Play server in the test scope.

Note: Lagom has a special development mode for rapid development, and does not fork the JVM when using the runAll or run commands in sbt. A forked JVM is necessary to gain metrics for actors and HTTP calls, since those are provided by the Cinnamon Java Agent.

> sbt "helloServiceImpl/test:runMain play.core.server.ProdServerStart"

The output should look something like this:

...
[info] Running (fork) play.core.server.ProdServerStart
...
[info] [INFO] [01/31/2018 10:54:11.118] [CoreAgent] Cinnamon Agent version 2.10.0
...
[info] 2018-01-31T09:54:15.001Z [info] play.api.Play [] - Application started (Prod)
[info] 2018-01-31T09:54:15.534Z [info] play.core.server.AkkaHttpServer [] - Listening for HTTP on /0:0:0:0:0:0:0:0:9000
[info] 1/31/18 10:54:18 AM ============================================================
[info] -- Gauges ----------------------------------------------------------------------
[info] metrics.akka.systems.application.dispatchers.akka_actor_default-dispatcher.active-threads
[info]              value = 0
[info] metrics.akka.systems.application.dispatchers.akka_actor_default-dispatcher.parallelism
[info]              value = 8
[info] metrics.akka.systems.application.dispatchers.akka_actor_default-dispatcher.pool-size
[info]              value = 3
[info] metrics.akka.systems.application.dispatchers.akka_actor_default-dispatcher.queued-tasks
[info]              value = 0
[info] metrics.akka.systems.application.dispatchers.akka_actor_default-dispatcher.running-threads
[info]              value = 0
[info] metrics.akka.systems.application.dispatchers.akka_io_pinned-dispatcher.active-threads
[info]              value = 1
[info] metrics.akka.systems.application.dispatchers.akka_io_pinned-dispatcher.pool-size
[info]              value = 1
[info] metrics.akka.systems.application.dispatchers.akka_io_pinned-dispatcher.running-threads
[info]              value = 0
...

To try out the hello-proxy service call, and to see metrics for the HTTP endpoints as well as the hello circuit breaker, either point your browser to http://localhost:9000/api/hello-proxy/World or run curl from the command line:

curl http://localhost:9000/api/hello-proxy/World

This should return: Hello service said: Hello World

The output from the server should now also contain metrics like this:

[info] -- Gauges ----------------------------------------------------------------------
...
[info] metrics.lagom.circuit-breakers.hello.state
[info]              value = 3
...
[info] -- Histograms ------------------------------------------------------------------
[info] metrics.akka-http.systems.application.http-servers.0_0_0_0_0_0_0_1_9000.request-paths._api_hello-proxy__id.endpoint-response-time
[info]              count = 1
[info]                min = 472903565
[info]                max = 472903565
[info]               mean = 472903565.00
[info]             stddev = 0.00
[info]             median = 472903565.00
[info]               75% <= 472903565.00
[info]               95% <= 472903565.00
[info]               98% <= 472903565.00
[info]               99% <= 472903565.00
[info]             99.9% <= 472903565.00
...
[info] metrics.lagom.circuit-breakers.hello.latency
[info]              count = 1
[info]                min = 331715012
[info]                max = 331715012
[info]               mean = 331715012.00
[info]             stddev = 0.00
[info]             median = 331715012.00
[info]               75% <= 331715012.00
[info]               95% <= 331715012.00
[info]               98% <= 331715012.00
[info]               99% <= 331715012.00
[info]             99.9% <= 331715012.00
...