camel-spark-kafka-connector sink configuration

When using camel-spark-kafka-connector as a sink, make sure to use the following Maven dependency to have support for the connector:

<dependency>
  <groupId>org.apache.camel.kafkaconnector</groupId>
  <artifactId>camel-spark-kafka-connector</artifactId>
  <version>x.x.x</version>
  <!-- use the same version as your Camel Kafka connector version -->
</dependency>

To use this sink connector in Kafka Connect, you'll need to set the following connector.class:

connector.class=org.apache.camel.kafkaconnector.spark.CamelSparkSinkConnector
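
As a minimal sketch, a sink connector properties file could look like the following. The connector name, task count, and topic are illustrative placeholders, and the endpoint type is just one of the supported values described in the options below:

# Illustrative connector name and topic; adjust for your deployment
name=CamelSparkSinkConnector
connector.class=org.apache.camel.kafkaconnector.spark.CamelSparkSinkConnector
tasks.max=1
topics=mytopic
# endpointType is required; one of rdd, dataframe, hive (see the options below)
camel.sink.path.endpointType=rdd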

The camel-spark sink connector supports 13 options, which are listed below.

camel.sink.path.endpointType
  Description: Type of the endpoint (rdd, dataframe, hive). One of: [rdd] [dataframe] [hive]
  Default: null | Required: true | Priority: HIGH

camel.sink.endpoint.collect
  Description: Indicates if results should be collected or counted.
  Default: true | Required: false | Priority: MEDIUM

camel.sink.endpoint.dataFrame
  Description: DataFrame to compute against.
  Default: null | Required: false | Priority: MEDIUM

camel.sink.endpoint.dataFrameCallback
  Description: Function performing an action against a DataFrame.
  Default: null | Required: false | Priority: MEDIUM

camel.sink.endpoint.lazyStartProducer
  Description: Whether the producer should be started lazily (on the first message). Starting lazily allows CamelContext and routes to start up in situations where a producer may otherwise fail during startup and cause the route to fail to start. By deferring this startup, the failure can instead be handled while routing messages via Camel's routing error handlers. Beware that when the first message is processed, creating and starting the producer may take a little time and prolong the total processing time.
  Default: false | Required: false | Priority: MEDIUM

camel.sink.endpoint.rdd
  Description: RDD to compute against.
  Default: null | Required: false | Priority: MEDIUM

camel.sink.endpoint.rddCallback
  Description: Function performing an action against an RDD.
  Default: null | Required: false | Priority: MEDIUM

camel.sink.endpoint.basicPropertyBinding
  Description: Whether the endpoint should use basic property binding (Camel 2.x) or the newer property binding with additional capabilities.
  Default: false | Required: false | Priority: MEDIUM

camel.sink.endpoint.synchronous
  Description: Sets whether synchronous processing should be strictly used, or Camel is allowed to use asynchronous processing (if supported).
  Default: false | Required: false | Priority: MEDIUM

camel.component.spark.lazyStartProducer
  Description: Whether the producer should be started lazily (on the first message). Starting lazily allows CamelContext and routes to start up in situations where a producer may otherwise fail during startup and cause the route to fail to start. By deferring this startup, the failure can instead be handled while routing messages via Camel's routing error handlers. Beware that when the first message is processed, creating and starting the producer may take a little time and prolong the total processing time.
  Default: false | Required: false | Priority: MEDIUM

camel.component.spark.rdd
  Description: RDD to compute against.
  Default: null | Required: false | Priority: MEDIUM

camel.component.spark.rddCallback
  Description: Function performing an action against an RDD.
  Default: null | Required: false | Priority: MEDIUM

camel.component.spark.basicPropertyBinding
  Description: Whether the component should use basic property binding (Camel 2.x) or the newer property binding with additional capabilities.
  Default: false | Required: false | Priority: LOW
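
As a sketch of how these options are set in practice, endpoint options use the camel.sink.endpoint. prefix and component options use the camel.component.spark. prefix in the same connector properties file. The values below are illustrative only, not recommended settings:

# Illustrative values only
camel.sink.endpoint.collect=false
camel.component.spark.lazyStartProducer=true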

The camel-spark sink connector has no converters out of the box.

The camel-spark sink connector has no transforms out of the box.

The camel-spark sink connector has no aggregation strategies out of the box.