Stream your ledger data in real time to analytics systems like ClickHouse, Elasticsearch, or custom HTTP endpoints. This lets you build dashboards, enable full-text search, or trigger external workflows without polling the API.
## What gets streamed
Every change to your ledger is recorded as a log entry. Pipelines stream these logs to your chosen destination:
| Log type | When it’s created |
|---|---|
| `NEW_TRANSACTION` | A transaction is created |
| `REVERTED_TRANSACTION` | A transaction is reverted |
| `SET_METADATA` | Metadata is added to an account or transaction |
| `DELETE_METADATA` | Metadata is removed |
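A consumer can branch on these type values. Below is a minimal dispatcher sketch in Python; `handle_log` is a hypothetical helper, and the field layout follows the HTTP payload example later on this page:

```python
# Dispatch a ledger log entry based on its "type" field.
# A sketch only: the "data" layout follows the HTTP driver's
# payload example, not a guaranteed schema.

def handle_log(log: dict) -> str:
    log_type = log["type"]
    if log_type == "NEW_TRANSACTION":
        tx = log["data"]["transaction"]
        return f"transaction {tx['id']} created"
    elif log_type == "REVERTED_TRANSACTION":
        return "transaction reverted"
    elif log_type in ("SET_METADATA", "DELETE_METADATA"):
        return f"metadata change: {log_type}"
    else:
        return f"unknown log type: {log_type}"

print(handle_log({
    "type": "NEW_TRANSACTION",
    "data": {"transaction": {"id": 1}},
}))  # → transaction 1 created
```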
## Quick start
Here’s the complete flow to start streaming data:
1. Create an exporter (defines where data goes)
```bash
curl -X POST http://localhost:3068/v2/_/exporters \
  -H "Content-Type: application/json" \
  -d '{
    "driver": "clickhouse",
    "config": {
      "dsn": "clickhouse://localhost:9000"
    }
  }'
```
Response:
```json
{
  "data": {
    "id": "exp-abc123",
    "driver": "clickhouse",
    "config": { "dsn": "clickhouse://localhost:9000" },
    "createdAt": "2025-01-15T10:00:00Z"
  }
}
```
2. Create a pipeline (connects a ledger to the exporter)
```bash
curl -X POST http://localhost:3068/v2/my-ledger/pipelines \
  -H "Content-Type: application/json" \
  -d '{
    "exporterID": "exp-abc123"
  }'
```
Pipelines start automatically. Logs begin streaming immediately.
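The same two-step flow can be sketched in Python with the standard library. The base URL matches the curl examples; the helper names are illustrative. Building `Request` objects keeps the sketch inspectable without a live server — send one with `urllib.request.urlopen(req)`:

```python
import json
import urllib.request

BASE = "http://localhost:3068/v2"  # same address as the curl examples


def create_exporter_request(driver: str, config: dict) -> urllib.request.Request:
    """Build the POST /v2/_/exporters request."""
    return urllib.request.Request(
        f"{BASE}/_/exporters",
        data=json.dumps({"driver": driver, "config": config}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def create_pipeline_request(ledger: str, exporter_id: str) -> urllib.request.Request:
    """Build the POST /v2/{ledger}/pipelines request."""
    return urllib.request.Request(
        f"{BASE}/{ledger}/pipelines",
        data=json.dumps({"exporterID": exporter_id}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = create_exporter_request("clickhouse", {"dsn": "clickhouse://localhost:9000"})
print(req.full_url)  # http://localhost:3068/v2/_/exporters
```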
## Exporter drivers

### ClickHouse
Streams logs to a ClickHouse database. Creates a logs table automatically.
```json
{
  "driver": "clickhouse",
  "config": {
    "dsn": "clickhouse://user:password@localhost:9000/database"
  }
}
```

| Field | Required | Description |
|---|---|---|
| `dsn` | Yes | ClickHouse connection string |
### Elasticsearch
Streams logs to an Elasticsearch index for full-text search.
```json
{
  "driver": "elasticsearch",
  "config": {
    "endpoint": "https://localhost:9200",
    "index": "ledger-logs",
    "authentication": {
      "username": "elastic",
      "password": "secret"
    }
  }
}
```

| Field | Required | Description |
|---|---|---|
| `endpoint` | Yes | Elasticsearch URL |
| `index` | No | Index name (default: `unified-stack-data`) |
| `authentication.username` | No | Username for basic auth |
| `authentication.password` | No | Password for basic auth |
| `authentication.awsEnabled` | No | Use AWS IAM authentication instead |
### HTTP
Sends logs as JSON POST requests to any endpoint—useful for webhooks or custom integrations.
```json
{
  "driver": "http",
  "config": {
    "url": "https://your-service.example.com/ledger-logs"
  }
}
```

| Field | Required | Description |
|---|---|---|
| `url` | Yes | Destination URL |
Your endpoint receives a JSON array of log entries:
```json
[
  {
    "ledger": "my-ledger",
    "id": 42,
    "type": "NEW_TRANSACTION",
    "date": "2025-01-15T10:30:00Z",
    "data": {
      "transaction": {
        "id": 1,
        "postings": [...],
        "metadata": {}
      }
    }
  }
]
```
Return a 2xx status code to acknowledge receipt.
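A minimal receiving endpoint can be sketched with the Python standard library. The handler name is illustrative, and the payload shape is assumed to match the example above:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class LedgerLogHandler(BaseHTTPRequestHandler):
    """Accepts the JSON array of log entries POSTed by the pipeline."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        logs = json.loads(self.rfile.read(length))
        for log in logs:
            # Each entry carries the ledger name, log ID, and type.
            print(f"[{log['ledger']}] log {log['id']}: {log['type']}")
        # Any 2xx status acknowledges receipt; anything else
        # signals a failed delivery.
        self.send_response(204)
        self.end_headers()
```

Serve it with `HTTPServer(("", 8080), LedgerLogHandler).serve_forever()` and point the exporter's `url` at the listening address.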
## Managing exporters

```bash
# List all exporters
curl http://localhost:3068/v2/_/exporters

# Get a specific exporter
curl http://localhost:3068/v2/_/exporters/{exporterID}

# Delete an exporter
curl -X DELETE http://localhost:3068/v2/_/exporters/{exporterID}
```
## Managing pipelines

### Check pipeline status

```bash
curl http://localhost:3068/v2/my-ledger/pipelines/{pipelineID}
```

The `lastLogID` field shows how far the pipeline has progressed:

```json
{
  "data": {
    "id": "pipe-xyz789",
    "ledger": "my-ledger",
    "exporterID": "exp-abc123",
    "lastLogID": 1042,
    "enabled": true,
    "createdAt": "2025-01-15T10:00:00Z"
  }
}
```
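One way to use this field is to poll the status endpoint until the pipeline has caught up to a log ID you care about. A sketch, assuming the same base URL as the curl examples (`has_caught_up` and `wait_for_log` are hypothetical helpers):

```python
import json
import time
import urllib.request


def has_caught_up(pipeline_status: dict, target_log_id: int) -> bool:
    """True once the pipeline has exported the log with the given ID."""
    return pipeline_status["data"]["lastLogID"] >= target_log_id


def wait_for_log(ledger: str, pipeline_id: str, target_log_id: int,
                 base: str = "http://localhost:3068/v2") -> None:
    """Poll the status endpoint until the pipeline reaches target_log_id."""
    url = f"{base}/{ledger}/pipelines/{pipeline_id}"
    while True:
        with urllib.request.urlopen(url) as resp:
            if has_caught_up(json.load(resp), target_log_id):
                return
        time.sleep(1.0)


print(has_caught_up({"data": {"lastLogID": 1042}}, 1000))  # True
```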
### Stop and restart

Pipelines remember their position, so you can stop and restart them without losing progress:

```bash
# Stop streaming
curl -X POST http://localhost:3068/v2/my-ledger/pipelines/{pipelineID}/stop

# Resume streaming
curl -X POST http://localhost:3068/v2/my-ledger/pipelines/{pipelineID}/start
```
### Replay from the beginning

Reset the pipeline to re-stream all logs:

```bash
curl -X POST http://localhost:3068/v2/my-ledger/pipelines/{pipelineID}/reset
```
This replays all historical logs. Make sure your destination can handle duplicates or clear it first.
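If clearing the destination isn't an option, the consumer can deduplicate instead. A minimal in-memory sketch, assuming log IDs are unique within a ledger (as the payload example suggests); a production consumer would persist the seen keys or use an idempotent upsert:

```python
# Deduplicate replayed log entries on their (ledger, id) pair.

seen: set[tuple[str, int]] = set()


def process_once(log: dict, apply) -> bool:
    """Apply a log entry only if it has not been seen before."""
    key = (log["ledger"], log["id"])
    if key in seen:
        return False  # duplicate delivery, e.g. after a reset
    seen.add(key)
    apply(log)
    return True


handled = []
log = {"ledger": "my-ledger", "id": 42, "type": "NEW_TRANSACTION"}
process_once(log, handled.append)  # applied
process_once(log, handled.append)  # duplicate, skipped
print(len(handled))  # 1
```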
### Delete a pipeline

```bash
curl -X DELETE http://localhost:3068/v2/my-ledger/pipelines/{pipelineID}
```