Configure a Kafka target for an open data stream

Product Requirements
- RevealX Enterprise and ExtraHop Performance systems
You can export data on an ExtraHop system to any Kafka server for long-term archiving and comparison with other sources.
- Log in to the Administration settings on the ExtraHop system through https://<extrahop-hostname-or-IP-address>/admin. Repeat these steps on each sensor in your environment.
- In the System Configuration section, click Open Data Streams.
- Click Add Target.
- From the Target Type drop-down list, select Kafka.
- In the Name field, type a name to identify the target.
- From the Compression drop-down list, select one of the following compression methods that will be applied to the transmitted data:
  - None
  - GZIP
  - Snappy
- From the Partition strategy drop-down list, select one of the following partitioning methods that will be applied to the transmitted data:
  - Default (Hash Key)
  - Manual
  - Random
  - Round Robin
- (Optional) Configure SASL/SCRAM authentication.
  - From the Authentication drop-down list, select SASL/SCRAM.
  - In the Username field, type the name of the SASL/SCRAM user.
  - In the Password field, type the password of the SASL/SCRAM user.
  - From the Hashing Algorithm drop-down list, select the hashing algorithm for SASL authentication.
- From the Protocol drop-down list, select one of the following protocols over which to transmit data:
  - TCP
  - SSL/TLS
- (Optional) If you selected the SSL/TLS protocol, specify certificate options.
  - If the Kafka server requires client authentication, in the Client certificate field, specify a TLS client certificate to send to the server.
  - If you specified a client certificate, in the Client key field, specify the private key of the certificate.
  - If you do not want to verify the certificate of the Kafka server, select Skip server certificate verification.
  - If you want to verify the certificate of the Kafka server, but the certificate has not been signed by a valid Certificate Authority (CA), in the CA certificates (optional) field, specify trusted certificates, in PEM format, with which to verify the server certificate. If this option is not specified, the server certificate is validated with the built-in list of valid CA certificates.
- Specify at least one Kafka broker, also referred to as a node in a Kafka cluster, that can receive transmitted data.
  Note: You can add multiple brokers that are part of the same Kafka cluster to ensure connectivity in case a single broker is unavailable. All brokers must be part of the same cluster.
  - In the Host field, type the hostname or IP address of the Kafka broker.
  - In the Port field, type the port number of the Kafka broker.
  - Click the plus (+) icon.
- (Optional) Click Test to establish a connection between the ExtraHop system and the remote Kafka server and send a test message to the server.
  The dialog box displays a message that indicates whether the connection succeeded or failed.
  Tip: If the test fails, check the logs on your Kafka server for more detailed information about the error, then edit the target configuration and test the connection again.
- Click Save.
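For reference, the Compression and Partition strategy settings above correspond to standard Kafka client options. The following sketch shows how the same choices look on a generic Kafka producer, using the confluent-kafka Python library; the broker addresses, topic name, keys, and record values are placeholders for illustration only.

```python
# Sketch: a generic Kafka producer whose compression and partitioning options
# parallel the ODS target settings. All names below are placeholders.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "kafka-broker1:9092,kafka-broker2:9092",
    "compression.type": "snappy",  # or "gzip" / "none", matching the Compression setting
})

# Default (Hash Key): records that share a key consistently map to the same partition.
producer.produce("extrahop-ods", key="device-1", value=b'{"metric": 42}')

# Manual: send the record to an explicit partition number.
producer.produce("extrahop-ods", value=b'{"metric": 43}', partition=0)

producer.flush()
```

With a Random or Round Robin strategy, keyless records are spread across partitions without regard to record contents.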
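Similarly, the SASL/SCRAM and SSL/TLS options map to standard Kafka client security properties. The sketch below is a minimal illustration with the confluent-kafka Python library, useful for checking that the credentials and certificates you plan to enter above are accepted by your brokers; the username, password, and file paths are placeholder values.

```python
# Sketch: client security settings that mirror the SASL/SCRAM and SSL/TLS
# options above. Credentials and file paths are placeholders.
from confluent_kafka import Consumer

secure_conf = {
    "bootstrap.servers": "kafka-broker1:9093",
    "group.id": "ods-verify",
    "security.protocol": "SASL_SSL",               # SASL/SCRAM over SSL/TLS
    "sasl.mechanisms": "SCRAM-SHA-256",            # or SCRAM-SHA-512, matching the hashing algorithm
    "sasl.username": "ods-user",
    "sasl.password": "example-password",
    "ssl.ca.location": "/etc/ssl/certs/kafka-ca.pem",         # trusted CA certificates (PEM)
    "ssl.certificate.location": "/etc/ssl/certs/client.pem",  # client certificate, if the broker requires one
    "ssl.key.location": "/etc/ssl/private/client-key.pem",    # private key for the client certificate
}

consumer = Consumer(secure_conf)
```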
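After clicking Test, you can confirm that the test message reached your cluster by briefly reading the target topic with a consumer script. This is a minimal sketch that assumes a topic named extrahop-ods and brokers that accept unauthenticated TCP connections; adjust the configuration to match your deployment.

```python
# Sketch: poll the target topic briefly to confirm that records (including the
# ODS test message) are arriving. Topic and broker names are placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "kafka-broker1:9092,kafka-broker2:9092",  # list several brokers in the same cluster
    "group.id": "ods-verify",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["extrahop-ods"])

try:
    for _ in range(10):            # poll for roughly ten seconds
        msg = consumer.poll(1.0)
        if msg is None:
            continue
        if msg.error():
            print("Consumer error:", msg.error())
            continue
        print(f"partition={msg.partition()} offset={msg.offset()} value={msg.value()!r}")
finally:
    consumer.close()
```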
Next steps
Create a trigger that specifies what Kafka message data to send and initiates the transmission of data to the target. For more information, see the Remote.Kafka class in the ExtraHop Trigger API Reference.