Apache Kudu
The Alpakka Kudu connector supports writing to Apache Kudu tables.
Apache Kudu is a free and open source column-oriented data store in the Apache Hadoop ecosystem.
Project Info: Alpakka Kudu |  |
---|---|
Artifact | com.lightbend.akka : akka-stream-alpakka-kudu : 2.0.2 |
JDK versions | Adopt OpenJDK 8, Adopt OpenJDK 11 |
Scala versions | 2.12.11, 2.11.12, 2.13.3 |
JPMS module name | akka.stream.alpakka.kudu |
Readiness level | Since 0.19, 2018-05-09 |
Home page | https://doc.akka.io/docs/alpakka/current |
Release notes | In the documentation |
Issues | Github issues |
Sources | https://github.com/akka/alpakka |
Artifacts
- sbt
```scala
val AkkaVersion = "2.5.31"
libraryDependencies ++= Seq(
  "com.lightbend.akka" %% "akka-stream-alpakka-kudu" % "2.0.2",
  "com.typesafe.akka" %% "akka-stream" % AkkaVersion
)
```
- Maven
```xml
<properties>
  <akka.version>2.5.31</akka.version>
  <scala.binary.version>2.12</scala.binary.version>
</properties>
<dependency>
  <groupId>com.lightbend.akka</groupId>
  <artifactId>akka-stream-alpakka-kudu_${scala.binary.version}</artifactId>
  <version>2.0.2</version>
</dependency>
<dependency>
  <groupId>com.typesafe.akka</groupId>
  <artifactId>akka-stream_${scala.binary.version}</artifactId>
  <version>${akka.version}</version>
</dependency>
```
- Gradle
```gradle
versions += [
  AkkaVersion: "2.5.31",
  ScalaBinary: "2.12"
]
dependencies {
  compile group: 'com.lightbend.akka', name: "akka-stream-alpakka-kudu_${versions.ScalaBinary}", version: '2.0.2'
  compile group: 'com.typesafe.akka', name: "akka-stream_${versions.ScalaBinary}", version: versions.AkkaVersion
}
```
The table below lists the direct dependencies of this module; the dependency tree shows all libraries it depends on transitively.
- Direct dependencies

Organization | Artifact | Version
---|---|---
com.typesafe.akka | akka-stream_2.12 | 2.5.31
org.apache.kudu | kudu-client-tools | 1.7.1
org.scala-lang | scala-library | 2.12.11

- Dependency tree

```text
com.typesafe.akka  akka-stream_2.12  2.5.31
    com.typesafe.akka  akka-actor_2.12  2.5.31
        com.typesafe  config  1.3.3
        org.scala-lang.modules  scala-java8-compat_2.12  0.8.0
            org.scala-lang  scala-library  2.12.11
        org.scala-lang  scala-library  2.12.11
    com.typesafe.akka  akka-protobuf_2.12  2.5.31
        org.scala-lang  scala-library  2.12.11
    com.typesafe  ssl-config-core_2.12  0.3.8
        com.typesafe  config  1.3.3
        org.scala-lang.modules  scala-parser-combinators_2.12  1.1.2
            org.scala-lang  scala-library  2.12.11
        org.scala-lang  scala-library  2.12.11
    org.reactivestreams  reactive-streams  1.0.2
    org.scala-lang  scala-library  2.12.11
org.apache.kudu  kudu-client-tools  1.7.1
    org.slf4j  slf4j-api  1.7.25
org.scala-lang  scala-library  2.12.11
```
Configuration
To connect to Kudu you need to:

- describe the Kudu `Schema`,
- define a converter function to map your data type to a `PartialRow`,
- specify the Kudu `CreateTableOptions`,
- set up Alpakka's `KuduTableSettings`.
- Scala

```scala
// Kudu Schema
val cols = List(
  new ColumnSchema.ColumnSchemaBuilder("key", Type.INT32).key(true).build,
  new ColumnSchema.ColumnSchemaBuilder("value", Type.STRING).build
)
val schema = new Schema(cols.asJava)

// Converter function
case class Person(id: Int, name: String)
val kuduConverter: Person => PartialRow = { person =>
  val partialRow = schema.newPartialRow()
  partialRow.addInt(0, person.id)
  partialRow.addString(1, person.name)
  partialRow
}

// Kudu table options
val rangeKeys = List("key")
val createTableOptions =
  new CreateTableOptions().setNumReplicas(1).setRangePartitionColumns(rangeKeys.asJava)

// Alpakka settings
val kuduTableSettings = KuduTableSettings("test", schema, createTableOptions, kuduConverter)
```
- Java

```java
// Kudu Schema
List<ColumnSchema> columns = new ArrayList<>(2);
columns.add(new ColumnSchema.ColumnSchemaBuilder("key", Type.INT32).key(true).build());
columns.add(new ColumnSchema.ColumnSchemaBuilder("value", Type.STRING).build());
schema = new Schema(columns);

// Converter function
Function<Person, PartialRow> kuduConverter =
    person -> {
      PartialRow partialRow = schema.newPartialRow();
      partialRow.addInt(0, person.id);
      partialRow.addString(1, person.name);
      return partialRow;
    };

// Kudu table options
List<String> rangeKeys = Collections.singletonList("key");
CreateTableOptions createTableOptions =
    new CreateTableOptions().setNumReplicas(1).setRangePartitionColumns(rangeKeys);

// Alpakka settings
KuduTableSettings<Person> tableSettings =
    KuduTableSettings.create("tablenameSink", schema, createTableOptions, kuduConverter);
```
By default, the KuduClient is automatically managed by the connector. Settings for the client are read from the reference.conf file. A manually initialized client can be injected into the stream using KuduAttributes.
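For the automatically managed client, a minimal override in your application.conf might look like the sketch below; the `alpakka.kudu` path and the `master-address` key are assumed from the module's default reference.conf, so verify them against the version you use:

```hocon
# Assumed settings path for the connector's default KuduClient
alpakka.kudu {
  # Address of the Kudu master the managed client connects to
  master-address = "127.0.0.1:7051"
}
```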
- Scala

```scala
val masterAddress = "localhost:7051"
val client = new KuduClient.KuduClientBuilder(masterAddress).build
system.registerOnTermination(client.shutdown())

val flow: Flow[Person, Person, NotUsed] =
  KuduTable
    .flow(kuduTableSettings.withTableName("Flow"))
    .withAttributes(KuduAttributes.client(client))
```
- Java

```java
final String masterAddress = "localhost:7051";
final KuduClient client = new KuduClient.KuduClientBuilder(masterAddress).build();
system.registerOnTermination(
    () -> {
      try {
        client.shutdown();
      } catch (KuduException e) {
        e.printStackTrace();
      }
    });

final Flow<Person, Person, NotUsed> flow =
    KuduTable.flow(tableSettings.withTableName("Flow"))
        .withAttributes(KuduAttributes.client(client));
```
Writing to Kudu in a Flow
- Scala

```scala
val flow: Flow[Person, Person, NotUsed] =
  KuduTable.flow(kuduTableSettings.withTableName("Flow"))

val f = Source(11 to 20)
  .map(i => Person(i, s"zozo_$i"))
  .via(flow)
  .runWith(Sink.fold(0)((a, d) => a + d.id))
```
- Java

```java
Flow<Person, Person, NotUsed> flow = KuduTable.flow(tableSettings.withTableName("Flow"));

CompletionStage<List<Person>> run =
    Source.from(Arrays.asList(200, 201, 202, 203, 204))
        .map((i) -> new Person(i, String.format("name_%d", i)))
        .via(flow)
        .toMat(Sink.seq(), Keep.right())
        .run(materializer);
```
Writing to Kudu with a Sink
- Scala

```scala
val sink: Sink[Person, Future[Done]] =
  KuduTable.sink(kuduTableSettings.withTableName("Sink"))

val f = Source(1 to 10)
  .map(i => Person(i, s"zozo_$i"))
  .runWith(sink)
```
- Java

```java
final Sink<Person, CompletionStage<Done>> sink =
    KuduTable.sink(tableSettings.withTableName("Sink"));

CompletionStage<Done> o =
    Source.from(Arrays.asList(100, 101, 102, 103, 104))
        .map((i) -> new Person(i, String.format("name %d", i)))
        .runWith(sink, materializer);
```