Connectors
Connectors are part of our open source repository of samples, examples, and integrations. Connectors help you connect your pipelines with other vendors and services such as AWS and Kafka.

You can explore the connector README files here in Quix Docs. When you are ready to start using them, head over to the Quix Code Samples GitHub repository, or sign up and log in to the platform.
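Every source connector follows the same basic pattern: read records from an external system and publish each one to a Kafka topic. The sketch below illustrates that pattern in miniature; it is a hypothetical example, not code from the Quix repository, and `InMemoryProducer`, `run_source`, and `fake_feed` are illustrative stand-ins for a real producer and data feed.

```python
import json

class InMemoryProducer:
    """Stand-in for a real Kafka producer; collects messages instead of sending them."""
    def __init__(self):
        self.messages = []

    def produce(self, topic, key, value):
        # A real connector would serialize and send to the broker here.
        self.messages.append((topic, key, value))

def run_source(fetch_records, producer, topic):
    """Generic source-connector loop: fetch records and publish each to a topic."""
    for record in fetch_records():
        producer.produce(topic=topic, key=record["id"], value=json.dumps(record))

# Fake external feed, standing in for e.g. a REST API poll or WebSocket stream.
def fake_feed():
    yield {"id": "sensor-1", "temperature": 21.5}
    yield {"id": "sensor-2", "temperature": 19.8}

producer = InMemoryProducer()
run_source(fake_feed, producer, topic="sensor-data")
print(len(producer.messages))  # 2 records published
```

Destination connectors invert the loop: they consume from a topic and write each record out to the external system.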
Sources
- Ably: Subscribe to data in an Ably channel
- Azure IoT: Subscribe to Azure IoT and publish data to Kafka
- CoinAPI: Consume currency data from the CoinAPI REST API
- CoinAPI WebSocket API: Publish real-time currency data from the CoinAPI WebSocket API to a stream
- Confluent Kafka: Subscribe to your Confluent Kafka topic with this bridge
- Kinesis: Connect your Kinesis data to a Quix topic with this connector
- MQTT: Easily publish data to Quix from an MQTT topic
- Netatmo: Subscribe to sensor data from your Netatmo devices
- Postgres CDC: Stream data from a Postgres database CDC to Quix
- Google Pub/Sub: Subscribe to data in a Google Pub/Sub topic
- RSS: Integrate an RSS feed into your Quix pipeline
- Reddit: A Reddit scraper data feed for your Quix pipeline
- SQL Database CDC: Stream changes to data in a SQL database
- Segment: Stream Segment event data into Quix
- Twitter: Stream Twitter live messages to a Quix topic in real time
Destinations
- Ably: Publish data to an Ably channel
- BigQuery: Stream data from Quix to BigQuery
- Confluent Kafka: Publish data to your Confluent Kafka topic with this bridge
- Amazon DynamoDB: Publish Quix streams to Amazon DynamoDB
- Amazon Kinesis Data Firehose: Publish Quix streams to Amazon Kinesis Data Firehose
- MQTT: Easily publish data to an MQTT instance
- Postgres: Stream data from Quix to a Postgres database
- Prometheus: Expose Prometheus metrics for a topic
- Pushover: Push notifications and messages to your mobile with Pushover
- AWS Redshift: Publish data to AWS Redshift
- AWS S3: Publish Quix data to AWS S3
- SQL Database: Publish data to a SQL database
- Slack: Send notifications to Slack via webhooks
- Snowflake: Publish data to Snowflake
- Timescale: Stream data from Quix to a Timescale database
- Timestream: Publish Quix streams to Timestream
- Twilio: A basic example of how to send data from Quix using Twilio