KNX and Databricks Integration
Powerful performance with an easy integration, powered by Telegraf, the open source data connector built by InfluxData.
5B+
Telegraf downloads
#1
Time series database
Source: DB Engines
1B+
Downloads of InfluxDB
2,800+
Contributors
Powerful Performance, Limitless Scale
Collect, organize, and act on massive volumes of high-velocity data. Any data is more valuable when you think of it as time series data with InfluxDB, the #1 time series platform built to scale with Telegraf.
See Ways to Get Started
Input and output integration overview
The KNX plugin listens for messages from the KNX home-automation bus via a KNX-IP interface, allowing for real-time data integration from KNX-enabled devices.
Use Telegraf’s HTTP output plugin to push metrics straight into a Databricks Lakehouse by calling the SQL Statement Execution API with a JSON-wrapped INSERT or volume PUT command.
Integration details
KNX
The KNX plugin listens for messages transmitted over the KNX home-automation bus. It connects to the bus through a KNX-IP interface and understands the various message datapoint types (DPTs) that KNX uses to encode values. It is a service input: rather than polling on a scheduled interval, it stays connected and captures relevant metrics and events as they occur, enabling real-time data collection for building management and smart-home applications. The plugin can also derive multiple measurements from the KNX data, each mapping a datapoint type and a set of group addresses to a named metric, which allows flexible categorization of metrics and broadens the scope of data integration in smart environments.
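As a concrete illustration of what datapoint-type handling involves, here is a minimal Python sketch that decodes DPT 9.001, the two-byte float KNX uses for temperatures. This is not the plugin's implementation, just the published DPT 9.xxx bit layout worked out by hand:

def decode_dpt9(data: bytes) -> float:
    """Decode a KNX DPT 9.xxx two-byte float, e.g. DPT 9.001 (temperature).

    Bit layout: S EEEE MMMMMMMMMMM -- a sign bit, a 4-bit exponent, and an
    11-bit mantissa; sign and mantissa together form a 12-bit two's-complement
    value M, and the decoded value is 0.01 * M * 2**E.
    """
    raw = int.from_bytes(data, "big")
    exponent = (raw >> 11) & 0x0F
    mantissa = raw & 0x07FF
    if raw & 0x8000:          # sign bit set: two's-complement mantissa
        mantissa -= 0x0800
    return 0.01 * mantissa * (2 ** exponent)

# 0x0C1A -> exponent 1, mantissa 1050 -> 0.01 * 1050 * 2 = 21.0 degC
assert abs(decode_dpt9(bytes([0x0C, 0x1A])) - 21.0) < 1e-6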
Databricks
This configuration turns Telegraf into a lightweight ingestion agent for the Databricks Lakehouse. It leverages the Databricks SQL Statement Execution API 2.0, which accepts authenticated POST requests containing a JSON payload with a statement field. Each Telegraf flush dynamically renders a SQL INSERT (or, for file-based workflows, a PUT ... INTO /Volumes/... command) that lands the metrics in a Unity Catalog table or volume governed by Lakehouse security. Under the hood, Databricks stores successful inserts as Delta Lake transactions, enabling ACID guarantees, time travel, and scalable analytics. Operators can point the warehouse_id at any serverless or classic SQL warehouse, and all authentication is handled with a PAT or service-principal token, with no agents or JDBC drivers required. Because Telegraf's HTTP output supports custom headers, batching, TLS, and proxy settings, the same pattern scales from edge IoT gateways to container sidecars, consolidating infrastructure telemetry, application logs, or business KPIs directly into the Lakehouse for BI, ML, and Lakehouse Monitoring. Unity Catalog volumes provide a governed staging layer when file uploads and COPY INTO are preferred, and the approach aligns with Databricks' recommended ingestion practices for partners and ISVs.
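Before wiring up Telegraf, the request shape is easy to verify by hand. The following minimal Python sketch issues the same kind of call the configuration below makes; the host, token, warehouse ID, and the main.telemetry.cpu table are placeholders to substitute for your own workspace:

import os
import requests

host = os.environ["DATABRICKS_HOST"]        # e.g. "dbc-xxxx.cloud.databricks.com"
token = os.environ["DATABRICKS_TOKEN"]      # PAT or service-principal token

# Single-row INSERT, analogous to what one Telegraf flush renders.
payload = {
    "statement": (
        "INSERT INTO main.telemetry.cpu "   # placeholder Unity Catalog table
        "VALUES (from_unixtime(1700000000000 / 1000), 42.5, 'edge-gw-01')"
    ),
    "warehouse_id": os.environ["WAREHOUSE_ID"],
}

resp = requests.post(
    f"https://{host}/api/2.0/sql/statements",
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["status"]["state"])       # e.g. "SUCCEEDED" or "PENDING"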
Configuration
KNX
[[inputs.knx_listener]]
## Type of KNX-IP interface.
## Can be either "tunnel_udp", "tunnel_tcp", "tunnel" (alias for tunnel_udp) or "router".
# service_type = "tunnel"
## Address of the KNX-IP interface.
service_address = "localhost:3671"
## Measurement definition(s)
# [[inputs.knx_listener.measurement]]
# ## Name of the measurement
# name = "temperature"
# ## Datapoint-Type (DPT) of the KNX messages
# dpt = "9.001"
# ## Use the string representation instead of the numerical value for the
# ## datapoint-type and the addresses below
# # as_string = false
# ## List of Group-Addresses (GAs) assigned to the measurement
# addresses = ["5/5/1"]
# [[inputs.knx_listener.measurement]]
# name = "illumination"
# dpt = "9.004"
# addresses = ["5/5/3"]
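With the two example measurement blocks enabled, each matching telegram becomes one metric. A representative sample of the resulting output in InfluxDB line protocol is shown below; the groupaddress tag and value field follow the plugin's documented example output and may vary by plugin version:

temperature,groupaddress=5/5/1 value=21 1672531200000000000
illumination,groupaddress=5/5/3 value=300 1672531200000000000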
Databricks
[[outputs.http]]
## Databricks SQL Statement Execution API endpoint
url = "https://{{ env "DATABRICKS_HOST" }}/api/2.0/sql/statements"
## Use POST to submit each Telegraf batch as a SQL request
method = "POST"
## Personal-access token (PAT) for workspace or service principal
headers = { Authorization = "Bearer ${DATABRICKS_TOKEN}" }
## Send JSON that wraps the metrics batch in a SQL INSERT (or PUT into a Volume)
content_type = "application/json"
## Serialize metrics as JSON so they can be embedded in the SQL statement
data_format = "json"
json_timestamp_units = "1ms"
## Build the request body. Telegraf replaces the template variables at runtime.
## Example inserts a row per metric into a Unity-Catalog table.
body_template = """
{
\"statement\": \"INSERT INTO ${TARGET_TABLE} VALUES {{range .Metrics}}(from_unixtime({{.timestamp}}/1000), {{.fields.usage}}, '{{.tags.host}}'){{end}}\",
\"warehouse_id\": \"${WAREHOUSE_ID}\"
}
"""
## Optional: add batching limits or TLS settings
# batch_size = 500
# timeout = "10s"
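For reference, a single-metric request body rendered from the template above would look roughly like this once the environment variables and metric values are substituted (the table name and warehouse ID are placeholders):

{
  "statement": "INSERT INTO main.telemetry.cpu VALUES (from_unixtime(1700000000000/1000), 42.5, 'edge-gw-01')",
  "warehouse_id": "abc123def456"
}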
Input and output integration examples
KNX
- Smart Home Energy Monitoring: Utilize the KNX plugin to monitor energy consumption across various devices in a smart home setup. By configuring measurements for different appliances, users can gather real-time data on power usage, enabling them to optimize energy consumption and reduce costs. This setup can also integrate with visualization tools to provide insights into energy trends and usage patterns.
- Automated Lighting Control System: Leverage this plugin to listen for lighting status updates from KNX sensors in a building. By capturing measurements related to illumination, users can develop an automated lighting control system that adjusts the brightness based on the time of day or occupancy, enhancing comfort and energy efficiency.
- HVAC Performance Tracking: Implement the KNX plugin to track temperature and ventilation data across different zones in a building. By monitoring these metrics, facilities managers can identify trends in HVAC performance, optimize climate control strategies, and proactively address maintenance needs to ensure consistent environmental quality.
- Integrated Security Solutions: Use the plugin to capture data from KNX security sensors, such as door/window open/close statuses. This information can be routed into a central monitoring system, providing real-time alerts and enabling automated responses, such as locking doors or activating alarms, thus enhancing the building's security posture.
Databricks
- Edge-to-Lakehouse Telemetry Pipe: Deploy Telegraf on factory PLCs to sample vibration metrics and post them every second to a serverless SQL warehouse. Delta tables power Power BI dashboards that alert engineers when thresholds drift.
- Blue-Green CI/CD Rollout Metrics: Attach a Telegraf sidecar to each Kubernetes canary pod; it inserts container stats into a Unity Catalog table tagged by deployment_id, letting Databricks SQL compare error-rate percentiles and auto-rollback underperforming versions.
- SaaS Usage Metering: Insert per-tenant API-call counters via the HTTP plugin; a nightly Lakehouse query aggregates usage into invoices, eliminating custom metering micro-services.
- Security Forensics Lake: Upload JSON batches of Suricata IDS events to a Unity Catalog volume using PUT commands, then run COPY INTO for near-real-time enrichment with Delta Live Tables, producing a searchable threat-intel lake that joins network logs with user session data.
Feedback
Thank you for being part of our community! If you have any general feedback or found any bugs on these pages, we welcome and encourage your input. Please submit your feedback in the InfluxDB community Slack.
Related Integrations
HTTP and InfluxDB Integration
The HTTP plugin collects metrics from one or more HTTP(S) endpoints. It supports various authentication methods and configuration options for data formats.
View Integration
Kafka and InfluxDB Integration
This plugin reads messages from Kafka and allows the creation of metrics based on those messages. It supports various configurations including different Kafka settings and message processing options.
View Integration
Kinesis and InfluxDB Integration
The Kinesis plugin allows for reading metrics from AWS Kinesis streams. It supports multiple input data formats and offers checkpointing features with DynamoDB for reliable message processing.
View Integration