The Aerospike destination writes data to Aerospike.
The Amazon S3 destination writes data to Amazon S3. To write data to an Amazon Kinesis Firehose delivery stream, use the Kinesis Firehose destination. To write data to Amazon Kinesis Streams, use the Kinesis Producer destination.
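For context, a minimal sketch of the equivalent object write using the AWS SDK for Python (boto3); the bucket name, object key, and record below are placeholder values, not destination configuration:

```python
import json
import boto3  # AWS SDK for Python

# Credentials come from the standard AWS credential chain; bucket and key are placeholders.
s3 = boto3.client("s3")
record = {"id": 1, "status": "shipped"}

# Write one record as a JSON object to the placeholder bucket.
s3.put_object(
    Bucket="example-bucket",
    Key="sdc/output/record-1.json",
    Body=json.dumps(record).encode("utf-8"),
    ContentType="application/json",
)
```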
The Cassandra destination writes data to a Cassandra cluster.
Constrained Application Protocol (CoAP) is a web transfer protocol designed for machine-to-machine communication between constrained devices. The CoAP Client destination writes data to a CoAP endpoint. Use the destination to send requests to a CoAP resource URL.
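As a rough illustration of that request flow, here is a sketch using the third-party aiocoap library; the resource URL and record are placeholders and the library choice is an assumption, not part of the destination itself:

```python
import asyncio
import json
import aiocoap  # third-party CoAP client library


async def send_record(record: dict) -> None:
    # Create a client context and POST the record to a placeholder CoAP resource URL.
    context = await aiocoap.Context.create_client_context()
    request = aiocoap.Message(
        code=aiocoap.POST,
        uri="coap://coap.example.com/sdc",
        payload=json.dumps(record).encode("utf-8"),
    )
    response = await context.request(request).response
    print("CoAP response code:", response.code)


asyncio.run(send_record({"sensor": "t1", "value": 21.5}))
```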
The Couchbase destination writes data to Couchbase Server. Couchbase Server is a distributed NoSQL document-oriented database.
The Elasticsearch destination writes data to an Elasticsearch cluster, including Elastic Cloud clusters (formerly Found clusters). The destination uses the Elasticsearch HTTP API to write each record to Elasticsearch as a document.
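For reference, indexing one record as a document through the Elasticsearch HTTP API looks roughly like this; the host, index name, and record are placeholders:

```python
import json
import requests  # plain HTTP client, used here only to show the document API

record = {"user": "ada", "action": "login"}

# Index one record as a document in a placeholder "logs" index on a local cluster.
response = requests.post(
    "http://localhost:9200/logs/_doc",
    headers={"Content-Type": "application/json"},
    data=json.dumps(record),
)
response.raise_for_status()
print(response.json()["result"])  # e.g. "created"
```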
The Einstein Analytics destination writes data to Salesforce Einstein Analytics. The destination connects to Einstein Analytics to upload external data to a dataset.
The Flume destination writes data to a Flume source. When you write data to Flume, you pass data to a Flume client. The Flume client passes data to hosts based on client configuration properties.
The HBase destination writes data to an HBase cluster. The destination can write data to HBase as text, binary data, or JSON strings. You can define the data format for each column written to HBase.
The Hive Metastore destination works with the Hive Metadata processor and the Hadoop FS or MapR FS destination as part of the Drift Synchronization Solution for Hive.
The Hive Streaming destination writes data to Hive tables stored in the ORC (Optimized Row Columnar) file format.
The HTTP Client destination writes data to an HTTP endpoint. The destination sends requests to an HTTP resource URL. Use the HTTP Client destination to perform a range of standard requests or use an expression to determine the request for each record.
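Conceptually, each record becomes one request to the resource URL. A minimal sketch of that pattern, with a placeholder endpoint and records:

```python
import requests

records = [{"id": 1}, {"id": 2}]

# Send one POST request per record to a placeholder resource URL.
for record in records:
    response = requests.post("https://api.example.com/events", json=record, timeout=10)
    response.raise_for_status()
```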
The InfluxDB destination writes data to an InfluxDB database.
The JDBC Producer destination uses a JDBC connection to write data to a database table. You can also use the JDBC Producer to write change capture data from a Microsoft SQL Server change log.
The Kafka Producer destination writes data to a Kafka cluster.
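A minimal sketch of the underlying produce call, assuming the kafka-python client; the broker address and topic name are placeholders:

```python
import json
from kafka import KafkaProducer  # kafka-python client

# Placeholder broker and topic; records are serialized as JSON.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("sdc-output", {"id": 1, "status": "shipped"})
producer.flush()  # block until the message is acknowledged
```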
The Kinesis Firehose destination writes data to an Amazon Kinesis Firehose delivery stream. Firehose automatically delivers the data to the Amazon S3 bucket or Amazon Redshift table that you specify in the delivery stream.
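The equivalent single-record write with boto3, against a placeholder delivery stream:

```python
import json
import boto3

firehose = boto3.client("firehose")

# Put one record into a placeholder delivery stream; Firehose buffers it and delivers it
# to whatever S3 bucket or Redshift table the delivery stream is configured with.
firehose.put_record(
    DeliveryStreamName="example-delivery-stream",
    Record={"Data": json.dumps({"id": 1}).encode("utf-8") + b"\n"},
)
```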
The Kinesis Producer destination writes data to Amazon Kinesis Streams. To write data to an Amazon Kinesis Firehose delivery stream, use the Kinesis Firehose destination. To write data to Amazon S3, use the Amazon S3 destination.
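For comparison, the single-record boto3 call against a placeholder Kinesis stream; note that Kinesis Streams requires a partition key, which selects the shard:

```python
import json
import boto3

kinesis = boto3.client("kinesis")

# Stream name and partition key are placeholders.
kinesis.put_record(
    StreamName="example-stream",
    Data=json.dumps({"id": 1}).encode("utf-8"),
    PartitionKey="1",
)
```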
The Kudu destination writes data to a Kudu cluster.
The MapR DB destination writes data to MapR DB binary tables. The destination can write data to MapR DB as text, binary data, or JSON strings. You can define the data format for each column written to MapR DB.
The MapR FS destination writes files to MapR FS. You can write the data to MapR FS as flat files or Hadoop sequence files.
The MapR Streams Producer destination writes messages to MapR Streams.
The MQTT Publisher destination publishes messages to a topic on an MQTT broker. The destination functions as an MQTT client that publishes messages, writing each record as a message.
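A sketch of that publish pattern with the Eclipse Paho Python client (1.x-style constructor); the broker host, topic, and record are placeholders:

```python
import json
import paho.mqtt.client as mqtt  # Eclipse Paho MQTT client

client = mqtt.Client()  # paho-mqtt 1.x constructor
client.connect("broker.example.com", 1883)

# Publish one record as one message on a placeholder topic, at QoS 1.
record = {"sensor": "t1", "value": 21.5}
client.publish("sdc/output", payload=json.dumps(record), qos=1)
client.disconnect()
```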
The Named Pipe destination writes data to a UNIX named pipe.
The Pulsar Producer destination writes data to topics in an Apache Pulsar cluster.
The RabbitMQ Producer destination writes AMQP messages to a single RabbitMQ queue.
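A bare-bones equivalent with the pika client, publishing to a placeholder queue through the default exchange; the host, queue name, and record are assumptions for illustration:

```python
import json
import pika  # RabbitMQ client library

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()

# Declare a placeholder queue and publish one record to it via the default exchange.
channel.queue_declare(queue="sdc-output", durable=True)
channel.basic_publish(
    exchange="",
    routing_key="sdc-output",
    body=json.dumps({"id": 1}),
)
connection.close()
```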
The Redis destination writes data to Redis.
The Salesforce destination writes data to Salesforce objects.
The SDC RPC destination enables connectivity between two SDC RPC pipelines. The SDC RPC destination passes data to one or more SDC RPC origins. Use the SDC RPC destination as part of an SDC RPC origin pipeline.
The Solr destination writes data to a Solr node or cluster.
The Splunk destination writes data to Splunk using the Splunk HTTP Event Collector (HEC).
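A minimal sketch of an HEC event post; the host, port, token, and event body are placeholders:

```python
import requests

# Placeholder HEC endpoint and token; HEC listens on port 8088 by default.
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

response = requests.post(
    HEC_URL,
    headers={"Authorization": f"Splunk {HEC_TOKEN}"},
    json={"event": {"id": 1, "status": "shipped"}, "sourcetype": "_json"},
    verify=False,  # self-signed HEC certificates are common in test setups
)
response.raise_for_status()
```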
The Syslog destination writes syslog messages to a Syslog server.
The WebSocket Client destination writes data to a WebSocket endpoint. Use the destination to send data to a WebSocket resource URL.
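A small sketch with the websocket-client library, sending one record as a text frame to a placeholder WebSocket resource URL:

```python
import json
from websocket import create_connection  # websocket-client library

# Connect to a placeholder WebSocket resource URL and send one record.
ws = create_connection("ws://ws.example.com/stream")
ws.send(json.dumps({"id": 1, "status": "shipped"}))
ws.close()
```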