You can export data on an ExtraHop Discover appliance to any Kafka server for
long-term archiving and comparison with other sources.
- Log in to the Administration page on the ExtraHop system through https://<extrahop-hostname-or-IP-address>/admin.
- In the System Configuration section, click Open Data Streams.
- Click Add Target.
- From the Target Type drop-down menu, select Kafka.
- In the Name field, type a name to identify the target.
- From the Compression drop-down list, select the compression method to apply to the transmitted data.
- From the Partition strategy drop-down list, select one of the following partitioning methods to apply to the transmitted data (a conceptual sketch of the default strategy follows this list):
  - Default (Hash Key)
  - Manual
  - Random
  - Round Robin
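The default strategy hashes each record's partition key and takes the result modulo the number of partitions, so records that share a key always land on the same partition and their per-key order is preserved; Random and Round Robin distribute records without key affinity, and Manual lets you assign partitions yourself. The JavaScript sketch below is conceptual only, not ExtraHop's or Kafka's implementation (Kafka's default partitioner uses a murmur2 hash rather than the simple hash shown here):

    // Conceptual sketch of a hash-key strategy: map a partition key to a
    // partition index. Illustrative only; Kafka's default partitioner
    // uses murmur2, not this simple hash.
    function pickPartition(key, numPartitions) {
        var hash = 0;
        for (var i = 0; i < key.length; i++) {
            hash = (hash * 31 + key.charCodeAt(i)) | 0; // stay in 32-bit range
        }
        return Math.abs(hash) % numPartitions;
    }

    // Records that share a key (here, a client IP) always map to the same
    // partition, which preserves their relative order within that partition.
    pickPartition("10.10.5.44", 6); // deterministic partition index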
- Specify at least one Kafka broker, also referred to as a node in a Kafka cluster, that can receive transmitted data.
  Note: You can add multiple brokers that are part of the same Kafka cluster to ensure connectivity in case a single broker is unavailable. All brokers must be part of the same cluster.
- In the Host field, type the hostname or IP address of the Kafka broker.
- In the Port field, type the port number of the Kafka broker.
- Click the plus (+) icon.
- (Optional): Click Test to establish a connection between the Discover appliance and the remote Kafka server and send a test message to the server.
  The dialog box displays a message that indicates whether the connection succeeded or failed. If the test fails, edit the target configuration and test the connection again.
- Click Save.
Next steps
Create a trigger that specifies what Kafka message data to send and initiates the transmission of data to the target. For more information, see the Remote.Kafka class in the ExtraHop Trigger API Reference.
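For orientation, the sketch below shows what such a trigger might look like. It is hypothetical: it assumes an HTTP_RESPONSE event, an illustrative topic name of "extrahop-http", and a Remote.Kafka.commit(topic, message) call; confirm the exact methods and signatures against the ExtraHop Trigger API Reference before relying on them.

    // Hypothetical trigger running on the HTTP_RESPONSE event; verify the
    // Remote.Kafka methods and signatures in the ExtraHop Trigger API Reference.
    var message = {
        host: HTTP.host,          // server hostname of the transaction
        uri: HTTP.uri,            // requested URI
        status: HTTP.statusCode   // HTTP response status code
    };

    // Commit the record to the assumed "extrahop-http" topic on the default
    // Kafka open data stream target configured in the steps above.
    Remote.Kafka.commit("extrahop-http", JSON.stringify(message));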