Quix Cloud Templates
Templates are part of our open-source repository of templates, examples, and integrations.
Templates help you connect Quix with other platforms and technologies, such as AWS and Kafka.
You can explore the README files here in Quix Docs. When you are ready to start using them, head over to the Quix Code Samples GitHub repository, or get a Quix Cloud account.
Start for free
Book a session with us to start for free.
We will create a time-limited free account for you, and our experts will help you get started with your specific use case.
Sources
- Consume data from a Kafka topic in Confluent Cloud and publish it to a topic in Quix.
- Environment Source: consume data from a Kafka topic in another environment. Useful for mirroring production data to development environments.
- Periodically query InfluxDB 2.0 and publish the results to a Kafka topic.
- Use the InfluxDB 3.0 query API to periodically query InfluxDB and publish the results to a Kafka topic.
- Use a Kafka Connect source template in the Quix platform.
- Consume data from an MQTT broker and publish it to a Kafka topic.
- Capture changes to a Postgres database table and publish the change events to a Kafka topic.
- Periodically query a Redis database and publish the results to a Kafka topic.
- Capture changes to an SQL database table and publish the change events to a Kafka topic.
- Read event data from Segment and publish it to a Kafka topic.
- Read data from Snowplow and publish it to a Kafka topic.
- Ingest data from a Telegraf source and publish it to a Quix topic.
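Several of the sources above share the same polling pattern: periodically run a query against an external system, then publish each result as a message to a Kafka topic. A minimal sketch of one polling cycle, with the query and the Kafka producer stubbed out (`fetch_rows` and `FakeProducer` are illustrative names, not part of any Quix template):

```python
import json


def fetch_rows():
    # Stand-in for a periodic query (e.g. against InfluxDB or Redis).
    return [{"sensor": "s1", "value": 21.5}, {"sensor": "s2", "value": 19.8}]


class FakeProducer:
    """Stand-in for a Kafka producer; collects messages instead of sending them."""

    def __init__(self):
        self.messages = []

    def produce(self, topic, key, value):
        self.messages.append((topic, key, value))


def poll_once(producer, topic):
    # One polling cycle: query, serialize, publish one message per row.
    for row in fetch_rows():
        producer.produce(topic=topic, key=row["sensor"],
                         value=json.dumps(row).encode("utf-8"))


producer = FakeProducer()
poll_once(producer, "sensor-readings")
# A real source would repeat this on a timer, e.g.
#   while True: poll_once(producer, "sensor-readings"); time.sleep(60)
```

In a deployed template the stubs would be replaced by the real client library for the upstream system and a real Kafka producer.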
Destinations
- Consume data from a Kafka topic and write it to an Apache Iceberg table stored in AWS S3, using the AWS Glue Data Catalog.
- Persist data from Quix to BigQuery.
- Consume data from a Kafka topic in Quix and publish it to a topic in Confluent Cloud.
- Consume data from a Kafka topic in Quix and persist the data to an InfluxDB 3.0 database.
- Use a Kafka Connect sink template in the Quix platform.
- Consume data from a Kafka topic and publish it to an MQTT broker.
- Consume data from a Kafka topic and persist it to Redis.
- Consume data from a Kafka topic and send Slack notifications based on your matching criteria.
- Websocket Destination: send data from Kafka to clients connected to this websocket server.
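Most destinations follow the mirror-image pattern: consume messages from a Kafka topic and write each one to the downstream system. A minimal sketch with the consumer and the destination stubbed out (`FakeConsumer`, `write_to_destination`, and `sink_store` are illustrative names, not part of any Quix template):

```python
import json


class FakeConsumer:
    """Stand-in for a Kafka consumer; yields pre-canned messages."""

    def __init__(self, messages):
        self._messages = list(messages)

    def poll(self):
        return self._messages.pop(0) if self._messages else None


# Stand-in for the destination (Redis, InfluxDB, a websocket client, ...).
sink_store = {}


def write_to_destination(key, payload):
    sink_store[key] = payload


def run_sink(consumer):
    # Drain the consumer and persist each message to the destination.
    while (msg := consumer.poll()) is not None:
        key, raw = msg
        write_to_destination(key, json.loads(raw))


consumer = FakeConsumer([("s1", b'{"value": 21.5}'),
                         ("s2", b'{"value": 19.8}')])
run_sink(consumer)
```

A real sink would also handle deserialization errors and commit consumer offsets only after the write to the destination succeeds.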