Overview

Apache Avro is a schema-based data serialization system. The Apache Avro extension enables SOAtest and Virtualize to support the Avro message format. It can read schemas from a Confluent Schema Registry or from schema files saved to a local drive. When a Confluent Schema Registry is used, a magic byte and a 4-byte schema ID are serialized in front of the Avro binary message as part of the associated transport's wire format.
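To make the wire format concrete, the following is a minimal Java sketch that frames an Avro record the way this section describes: one magic byte (0), the 4-byte schema ID in big-endian order, then the Avro binary body. It uses only the standard Apache Avro library; the schema, schema ID, and field values are hypothetical placeholders, and the sketch is illustrative rather than a description of the extension's internal implementation.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

public class WireFormatSketch {
    // Hypothetical schema; a real schema would come from the schemas folder or the registry.
    private static final String SCHEMA_JSON =
        "{\"type\":\"record\",\"name\":\"Example\",\"fields\":[{\"name\":\"message\",\"type\":\"string\"}]}";

    public static byte[] frame(GenericRecord record, Schema schema, int schemaId) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(0x00);                                             // magic byte
        out.write(ByteBuffer.allocate(4).putInt(schemaId).array());  // 4-byte schema ID, big-endian
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(record, encoder); // Avro binary body
        encoder.flush();
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        Schema schema = new Schema.Parser().parse(SCHEMA_JSON);
        GenericRecord record = new GenericData.Record(schema);
        record.put("message", "hello");
        byte[] framed = frame(record, schema, 42);                   // 42 is a placeholder schema ID
        System.out.println(framed.length + " bytes = 1 (magic) + 4 (schema ID) + Avro body");
    }
}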

Installing the extension adds the Apache Avro Client for SOAtest and the Apache Avro Message Responder for Virtualize.

Requirements

Installation

This artifact can be installed from the UI or the command line.

UI Installation

  1. Go to Parasoft > Preferences and click System Properties.
  2. Click Add JARs and choose the com.parasoft.soavirt.messages.avro-<version>.jar.
  3. Click Apply.
  4. Restart SOAtest/Virtualize.

Command Line Installation

Add the Avro .jar file to the system.properties.classpath property in your settings properties file. For example:

system.properties.classpath=<path to jar>/com.parasoft.soavirt.messages.avro-1.0.0.jar

Configuring Avro Schemas Location

The location of your Avro schema files is configured through a Java system property. This can be either the path to a local directory in which you have saved schemas or the URL of a Confluent Schema Registry.

-J-Dcom.parasoft.soavirt.messages.avro.schemas=<PATH_TO_SCHEMA_FOLDER or SCHEMA_REGISTRY_URL>

When specifying a Confluent Schema Registry, be sure to include the full URL (including the http:// or https:// prefix).
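For example, assuming a hypothetical local schema folder and a hypothetical registry host, the property could be set as either of the following:

-J-Dcom.parasoft.soavirt.messages.avro.schemas=C:\avro\schemas
-J-Dcom.parasoft.soavirt.messages.avro.schemas=http://schema-registry.example.com:8081

A schema saved to the local folder is a standard Avro schema definition in JSON. A minimal hypothetical example of such a definition (the record name and fields are placeholders):

{
  "type": "record",
  "name": "Order",
  "namespace": "com.example",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "quantity", "type": "int" }
  ]
}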

Usage

Adding the Avro Clients

You can add standalone Avro clients to your suite using the Add Test wizard or chain Avro tools as a payload output of an existing tool using the Add Output wizard. See Adding Projects, .tst files, and Test Suites for details.

Configuration and Usage

Avro clients function in a similar manner to other messaging clients, but are configured to use the Avro format by default. In addition, there are a few configuration options that are unique to Avro clients:

XML Conversion Options

Schema for Modeling Request Payload Using Form Input

Adding Avro Message Responders

You can add Avro message responders to your suite using the Add Responder wizard. See Creating Message Responders Manually for details. 

Configuration and Usage

Avro message responders function in a similar manner to other responders. Refer to Message Responder Overview for information on configuring and using responders. In addition, there are a few configuration options that are unique to Avro message responders:

XML Conversion Options

Schema for Modeling Request Payload Using Form Input

You can also configure the responder to use a different Avro message type for incoming requests. To do so:

  1. In the message responder, click the Options tab and choose Request Handling.
  2. Enable Convert incoming request to XML using different message format than response.
  3. Choose Apache Avro from the Format menu.
  4. Choose the message type that is expected to come into the virtual asset from the Message type menu that appears. The menu lists the type definitions available in the schemas in the Avro schemas folder or Confluent Schema Registry specified previously (see Configuring Avro Schemas Location), as illustrated in the example below.
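For example, if the configured schemas define two hypothetical record types, a responder that sends OrderReply messages could select OrderRequest here so that incoming requests are converted using the correct definition. Minimal hypothetical definitions of the two types:

{ "type": "record", "name": "OrderRequest", "fields": [ { "name": "id", "type": "string" } ] }

{ "type": "record", "name": "OrderReply", "fields": [ { "name": "status", "type": "string" } ] }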

Using Avro with Kafka

One of the benefits of the Avro extension is that it can be used in conjunction with the Kafka Transport to send messages in one format and receive messages in another, as in the test suite shown below.

The following example lays out a simple workflow for setting up such a scenario:

  1. If the Kafka record key is also Avro, add an Apache Avro Client to your test suite and configure it as follows, then save it:
  2. If the Kafka record key is also Avro, add an XML Data Bank output to the Avro client (right-click the Avro client and choose Add Output > Request > Payload Modeled as XML > XML Data Bank) and configure it as follows, then save it:
  3. Add another Apache Avro Client to the test suite (this will be a producer) and configure it as follows, then save it:
  4. If the consumed record is a different Avro schema than the record that was produced, add another Apache Avro Client to the test suite (this will be a consumer) and configure it as follows, then save it:
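On the consuming side, each record read from the topic carries the same framing described in the Overview. The following minimal Java sketch (standard Apache Avro library; the reader schema, class name, and error handling are assumptions for illustration, not the extension's internals) shows how the 1-byte magic and 4-byte schema ID could be stripped before decoding the Avro body:

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;

import java.io.IOException;
import java.nio.ByteBuffer;

public class FramedAvroDecoder {
    /** Strips the 1-byte magic and 4-byte schema ID, then decodes the remaining Avro binary body. */
    public static GenericRecord decode(byte[] framed, Schema readerSchema) throws IOException {
        ByteBuffer header = ByteBuffer.wrap(framed);
        if (header.get() != 0x00) {
            throw new IOException("Unexpected magic byte; payload is not in the registry wire format");
        }
        int schemaId = header.getInt(); // in a real consumer, used to look up the writer schema in the registry
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(framed, 5, framed.length - 5, null);
        return new GenericDatumReader<GenericRecord>(readerSchema).read(null, decoder);
    }
}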