Configure a Kafka connection

This feature is only available in the Test Automation for SAP API Scan and not in Tricentis Test Automation for SAP Commander. Tricentis Test Automation for SAP does not support the execution of TestCases using this feature.

To learn about upgrade options, check out SAP Enterprise Continuous Testing by Tricentis.

You can configure an Apache Kafka connection in the API Connection Manager.

This allows you to connect to a Kafka topic, which stores messages that are sent by one or more publishers and read by one or more consumers. For detailed information on Kafka, see the Apache Kafka documentation.

Authentication for the Kafka connection uses the SASL/PLAIN mechanism and is only available if SASL/PLAIN authentication is enabled.
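For orientation, the sketch below shows roughly how SASL/PLAIN is configured on a standard Kafka Java client; the API Connection Manager collects the equivalent values through the connection dialog described below. The broker address, class name, and credentials are placeholders, not product settings.

    import java.util.Properties;

    public class KafkaSaslPlainProperties {

        // Minimal sketch of SASL/PLAIN client settings for the standard Kafka Java client.
        // All values are placeholders for the Host, Port, Username, and Password fields.
        public static Properties build() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka.example.com:9092");  // Host and Port
            props.put("security.protocol", "SASL_PLAINTEXT");          // or SASL_SSL when TLS is used
            props.put("sasl.mechanism", "PLAIN");
            props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"myUser\" password=\"myPassword\";");
            return props;
        }
    }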

Configure your connection

To configure a Kafka connection, follow the steps below:

  1. Open the API Connection Manager.

  2. Add a new connection.

  3. Go to the Edit section and configure the following settings:

  1. Specify a Name for your connection.

  2. From the Type drop-down menu, select Kafka.

  3. Enter the name of the Topic you want to connect to.

  4. Specify the Host. This is the name or IP address of the Kafka server host.

  5. Enter the Port to listen to.

  6. Specify the GroupId of the consumer group. This is the name of the group of consumers that subscribed to this topic.

  7. Enter the number of the Partition that stores the records (messages).

    Kafka topics are split into multiple partitions, which allows large amounts of data to be distributed across one or more servers. Each partition has a number, such as 0 or 5. A sketch that maps the connection settings from the steps above onto a plain Kafka consumer follows this list.

  8. Enable Peeking to retrieve records from a partition without committing.

    This allows you to read any record, regardless of its position in the partition, without affecting the offset. The offset, which defines the position of a record, indicates which records have already been consumed and which unread record the API Engine should pull next. A sketch of this peek behavior with a plain Kafka consumer follows this list.

  9. If your connection requires authentication, enter your Username and Password. Only SASL/PLAIN authentication is supported.

  10. Optionally, configure XML and web service security.

  11. Optionally, add Avro schema-based serialization for Kafka messages to define the data schema for a record's value. This schema describes the fields allowed in the value, along with their data types. For detailed information on Avro schema, see the Apache Avro documentation.

    From the Schema Type drop-down menu, select Avro, then add a Schema Registry Url and a Schema that defines the data structure in JSON format.

  12. To pull Avro-serialized Kafka messages, provide the corresponding Avro schema and, optionally, change the Key Deserializer Type in the drop-down menu (Ignore by default). An example record schema follows this list.
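For orientation, the sketch below shows how the connection settings from the steps above (Host, Port, Topic, GroupId, Partition) map onto a plain Kafka Java consumer. All names and values are placeholders; this does not describe how the API Engine works internally, it only illustrates the meaning of the fields.

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.TopicPartition;

    public class KafkaConnectionFieldsSketch {

        public static void main(String[] args) {
            // Placeholder values that mirror the connection dialog fields.
            String host = "kafka.example.com";   // Host
            int port = 9092;                     // Port
            String topic = "orders";             // Topic
            String groupId = "tricentis-tests";  // GroupId
            int partition = 0;                   // Partition

            Properties props = new Properties();
            props.put("bootstrap.servers", host + ":" + port);
            props.put("group.id", groupId);
            props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                // Read records from one specific partition of the topic.
                consumer.assign(Collections.singletonList(new TopicPartition(topic, partition)));
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                        record.offset(), record.key(), record.value());
                }
            }
        }
    }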
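The peek behavior from step 8 corresponds to reading with auto-commit disabled and never committing, so the consumer group's offset stays where it was. Again, a minimal sketch with placeholder names, assuming a standard Kafka Java consumer:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.TopicPartition;

    public class KafkaPeekSketch {

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "kafka.example.com:9092");
            props.put("group.id", "tricentis-tests");
            props.put("enable.auto.commit", "false");  // never commit, so the offset does not move
            props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                TopicPartition tp = new TopicPartition("orders", 0);
                consumer.assign(Collections.singletonList(tp));
                consumer.seek(tp, 42L);  // jump to any offset within the partition
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println("peeked offset " + record.offset());
                }
                // No commitSync()/commitAsync() call: the committed offset is left unchanged.
            }
        }
    }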
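A record schema of the kind you might paste into the Schema field in step 11 could look like the following. The schema name and fields are purely illustrative; the Apache Avro library is used here only to show that the JSON parses as a valid record schema. The schema describes the record's value, while the key is handled by the Key Deserializer Type setting from step 12.

    import org.apache.avro.Schema;

    public class AvroSchemaSketch {

        // Illustrative record schema; field names and types are placeholders.
        static final String SCHEMA_JSON =
            "{"
            + " \"type\": \"record\","
            + " \"name\": \"Order\","
            + " \"fields\": ["
            + "   {\"name\": \"id\",     \"type\": \"long\"},"
            + "   {\"name\": \"item\",   \"type\": \"string\"},"
            + "   {\"name\": \"amount\", \"type\": \"double\"},"
            + "   {\"name\": \"note\",   \"type\": [\"null\", \"string\"], \"default\": null}"
            + " ]"
            + "}";

        public static void main(String[] args) {
            Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
            System.out.println(schema.toString(true));  // pretty-print the parsed schema
        }
    }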


You can now use your Kafka connection for testing. For detailed information on how to run Kafka messages, see chapter "Run Kafka messages".

Avro schema serialization format support

You can define and use the following Avro schema types:

Connection type: Apache Kafka version 2.1.0

Avro schema type    Supported    Partly supported*
Null                ✓
Boolean             ✓
Int                 ✓
Long                ✓
Float               ✓
Double              ✓
Bytes               ✓
String              ✓
Record*                          ✓
Enumeration
Array               ✓
Map
Union               ✓
Fixed
Error
Logical

*Record schema type differs as it contains fields of different types. Currently, the following types are supported by Tricentis as field types: Boolean, Int, Long, Float, Double, Bytes, String, Record, Null, Array, and Union.