Connectors

Connectors are part of our open source repository of samples, examples, and integrations.

Connectors let you integrate Quix with other platforms and technologies, such as AWS and Kafka.

You can explore the connector README files here in Quix Docs. When you are ready to start using them, head over to the Quix Code Samples GitHub repository, or get a Quix Cloud account.

Start for free

Book a session with us to start for free.

We will create a time-limited free account for you, and our experts will help you get started with your specific use case.

Sources

  • Confluent Kafka Source


    Consume data from a Kafka topic in Confluent Cloud and publish it to a topic in Quix.

  • Environment Source


    Consume data from a Kafka topic in another environment. Useful for mirroring production data into development environments.

  • InfluxDB 2.0 Source


    Periodically query InfluxDB 2.0 and publish the results to a Kafka topic.

  • InfluxDB 3.0 Source


    Use the InfluxDB 3.0 query API to periodically query InfluxDB and publish the results to a Kafka topic.
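Both InfluxDB sources follow the same poll-and-publish pattern: query the database on a schedule and forward each result to a Kafka topic. A minimal sketch of that loop, where `run_query` and `publish` are hypothetical callbacks standing in for the InfluxDB client and the Kafka producer used by the real connectors:

```python
import time

def poll_and_publish(run_query, publish, interval_s, iterations):
    """Periodically call run_query() and forward each result row to publish().

    run_query and publish are illustrative stand-ins, not the connector's
    actual API: in the real source they would be an InfluxDB query call
    and a Kafka producer, respectively.
    """
    for _ in range(iterations):
        for row in run_query():
            publish(row)
        time.sleep(interval_s)

# Usage with stand-ins: collect "published" rows in a list.
published = []
poll_and_publish(
    run_query=lambda: [{"measurement": "cpu", "value": 0.42}],
    publish=published.append,
    interval_s=0.01,
    iterations=3,
)
```

The real connectors run this loop indefinitely; the `iterations` bound here is only so the sketch terminates.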

  • Kafka Connect Source


    Install a Kafka Connect source connector in the Quix platform.

  • MQTT Source


    Consume data from an MQTT broker and publish it to a Kafka topic.

  • Postgres CDC Source


    Capture changes to a Postgres database table and publish the change events to a Kafka topic.

  • Redis Source


    Periodically query a Redis database and publish the results to a Kafka topic.

  • SQL Change Data Capture Source


    Capture changes to an SQL database table and publish the change events to a Kafka topic.
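Both change data capture sources emit one event per row change (insert, update, or delete). The real connectors read changes from the database's replication log, but the kinds of events they produce can be illustrated by diffing two snapshots keyed by primary key. The `{"op": ..., "key": ..., "row": ...}` shape below is a hypothetical illustration, not the schema either connector actually emits:

```python
def diff_snapshots(before, after):
    """Return CDC-style events describing how `after` differs from `before`.

    Both arguments map primary key -> row dict. The event shape here is
    illustrative only; real CDC connectors read a replication log rather
    than comparing snapshots.
    """
    events = []
    for key, row in after.items():
        if key not in before:
            events.append({"op": "insert", "key": key, "row": row})
        elif before[key] != row:
            events.append({"op": "update", "key": key, "row": row})
    for key in before:
        if key not in after:
            events.append({"op": "delete", "key": key, "row": None})
    return events

events = diff_snapshots(
    before={1: {"name": "ada"}, 2: {"name": "bob"}},
    after={1: {"name": "ada lovelace"}, 3: {"name": "cy"}},
)
```

Each event would then be published to the Kafka topic as one message, so downstream consumers can replay the table's history.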

  • Segment Source


    Read event data from Segment and publish it to a Kafka topic.

  • Snowplow Source


    Read data from Snowplow and publish it to a Kafka topic.

Destinations

  • AWS S3 Iceberg Sink


    Consume data from a Kafka topic and write it to an Apache Iceberg table stored in AWS S3 using the AWS Glue Data Catalog.

  • BigQuery Sink


    Persist data from Quix to BigQuery.

  • Confluent Kafka Sink


    Consume data from a Kafka topic in Quix and publish it to a topic in Confluent Cloud.

  • InfluxDB 3.0 Sink


    Consume data from a Kafka topic in Quix and persist the data to an InfluxDB 3.0 database.

  • Kafka Connect Sink


    Install a Kafka Connect sink connector in the Quix platform.

  • MQTT Sink


    Consume data from a Kafka topic and publish it to an MQTT broker.

  • Redis Sink


    Consume data from a Kafka topic and persist it to Redis.

  • Slack Sink


    Consume data from a Kafka topic and send Slack notifications based on your matching criteria.
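The Slack sink only notifies when a message matches your criteria. A minimal sketch of that idea, using a hypothetical threshold rule (the real connector's matching criteria and configuration are not shown here); the `{"text": ...}` JSON body is the format Slack incoming webhooks accept:

```python
import json

def should_notify(message: dict, field: str, threshold: float) -> bool:
    """Hypothetical matching rule: notify when message[field] exceeds threshold."""
    value = message.get(field)
    return isinstance(value, (int, float)) and value > threshold

def to_slack_payload(message: dict) -> str:
    """Build the JSON body a Slack incoming webhook expects: {"text": ...}."""
    return json.dumps({"text": f"Alert: {message}"})

# Usage: filter a stream of messages down to the ones worth posting.
messages = [{"temp": 72.0}, {"temp": 99.5}, {"humidity": 40}]
alerts = [m for m in messages if should_notify(m, "temp", 90.0)]
```

In the real sink, each matching message would be POSTed to a Slack webhook URL; here the network call is omitted so the filtering logic stands on its own.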

  • Websocket Destination


    Send data from Kafka to a client connected to this websocket server

    Websocket Destination
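The websocket destination fans each Kafka message out to every connected client. Setting the websocket transport aside, that broadcast pattern can be sketched with per-client asyncio queues; the `Broadcaster` class and all names below are illustrative, not the connector's actual implementation:

```python
import asyncio

class Broadcaster:
    """Fan-out hub: every message sent is delivered to each subscribed client."""

    def __init__(self):
        self.clients = set()

    def subscribe(self):
        # Each connected client gets its own queue of pending messages.
        queue = asyncio.Queue()
        self.clients.add(queue)
        return queue

    def send(self, message):
        # A single incoming Kafka message is copied to every client's queue.
        for queue in self.clients:
            queue.put_nowait(message)

async def main():
    hub = Broadcaster()
    a, b = hub.subscribe(), hub.subscribe()
    hub.send({"speed": 120})  # one message in...
    return await a.get(), await b.get()  # ...delivered to both clients

received_a, received_b = asyncio.run(main())
```

In the real destination, each queue would feed one open websocket connection instead of being read directly.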