Events API
API for event-based pipeline execution.
Overview
The Events API lets you trigger pipeline executions from messages published to Kafka topics. Serverless event processing is built on Knative Eventing.
Prerequisites
- The pipeline must have a Kafka topic specified in the event option.
- Knative Eventing and the Kafka source must be installed in the cluster.
Endpoints
POST /api/v1/events/{pipeline_id}
Starts the event trigger for a pipeline.
Path Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| pipeline_id | string | Yes | Pipeline ID |
Query Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| pipeline_version | string | No | Execute a specific pipeline version |
Response
200 OK
{
"message": "Event started successfully"
}
Behavior Description
- Deletes the existing Knative service and Kafka source, if present.
- Creates a new Knative service.
- Creates a Kafka source that subscribes to the specified topic.
- When a message arrives on the topic, the pipeline is executed.
Error Responses
| Status Code | Description |
|---|---|
| 500 Internal Server Error | The pipeline does not have the event option set |
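A client can surface the 500 case above as a descriptive error instead of a silent failure. A minimal sketch using the standard fetch API; the function name `startEventTrigger` and the injectable `fetchImpl` parameter are ours, not part of the API:

```javascript
// Start the event trigger for a pipeline and throw a descriptive
// error when the server rejects the request (e.g. HTTP 500 when the
// pipeline has no "event" option). `fetchImpl` defaults to the
// global fetch and is injectable for testing.
async function startEventTrigger(baseUrl, pipelineId, token, fetchImpl = fetch) {
  const res = await fetchImpl(`${baseUrl}/api/v1/events/${pipelineId}`, {
    method: 'POST',
    headers: { 'Authorization': `Bearer ${token}` }
  });
  if (!res.ok) {
    throw new Error(`Failed to start event trigger for ${pipelineId}: HTTP ${res.status}`);
  }
  return res.json(); // e.g. { message: "Event started successfully" }
}
```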
GET /api/v1/events/{pipeline_id}
Retrieves the status of an event trigger.
Path Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| pipeline_id | string | Yes | Pipeline ID |
Response
200 OK
{
"ready": true
}
| Field | Type | Description |
|---|---|---|
| ready | boolean | Knative service readiness status |
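Because the Knative service is created asynchronously, ready may be false for a short time after the trigger starts. A small polling sketch; the helper name and its parameters are ours, not part of the API:

```javascript
// Poll the event status until the Knative service reports ready, or
// give up after `maxAttempts`. `getStatus` should resolve to the
// GET /api/v1/events/{pipeline_id} response body, e.g. { ready: true }.
async function waitForReady(getStatus, { maxAttempts = 10, intervalMs = 1000 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { ready } = await getStatus();
    if (ready) return true;
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
  return false;
}
```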
DELETE /api/v1/events/{pipeline_id}
Stops and deletes the event trigger.
Path Parameters
| Parameter | Type | Required | Description |
|---|---|---|---|
| pipeline_id | string | Yes | Pipeline ID |
Response
200 OK
{
"message": "Event deleted successfully"
}
Pipeline Configuration
To use event triggers, you must specify a Kafka topic in the pipeline options.
{
"id": "pipeline-abc123",
"name": "Event Pipeline",
"options": {
"event": "my-kafka-topic"
},
"steps": [...]
}
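A client can check this configuration before starting a trigger, since a missing options.event is what produces the 500 error on POST. A minimal sketch; the function name `eventTopic` is ours:

```javascript
// Return the Kafka topic configured for event triggering, or null
// if the pipeline has no usable "event" option. The pipeline shape
// follows the configuration example above.
function eventTopic(pipeline) {
  const topic = pipeline?.options?.event;
  return typeof topic === 'string' && topic.length > 0 ? topic : null;
}
```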
Architecture
┌──────────────┐ ┌─────────────────┐ ┌──────────────────┐
│ Kafka Topic │────>│ Knative Source │────>│ Knative Service │
│ (event) │ │ (KafkaSource) │ │ (Pipeline Runner)│
└──────────────┘ └─────────────────┘ └──────────────────┘
- Kafka Topic: Topic where event messages are published
- Knative Source: Subscribes to Kafka topic and delivers events
- Knative Service: Receives events and executes pipeline
Usage Examples
cURL
# Start event trigger
curl -X POST https://api.dhub.io/api/v1/events/pipeline-abc123 \
-H "Authorization: Bearer <access_token>"
# Get event status
curl https://api.dhub.io/api/v1/events/pipeline-abc123 \
-H "Authorization: Bearer <access_token>"
# Stop event trigger
curl -X DELETE https://api.dhub.io/api/v1/events/pipeline-abc123 \
-H "Authorization: Bearer <access_token>"
Event-Based Pipeline Workflow
// 1. Create pipeline (including event option)
await fetch('/api/v1/pipelines/', {
method: 'POST',
headers: {
'Authorization': `Bearer ${token}`,
'Content-Type': 'application/json'
},
body: JSON.stringify({
name: 'Order Processing',
options: {
event: 'orders-topic'
},
steps: [...]
})
});
// 2. Start event trigger
await fetch(`/api/v1/events/${pipelineId}`, {
method: 'POST',
headers: { 'Authorization': `Bearer ${token}` }
});
// 3. Check status
const status = await fetch(`/api/v1/events/${pipelineId}`, {
headers: { 'Authorization': `Bearer ${token}` }
}).then(res => res.json());
console.log('Event ready:', status.ready);
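When the pipeline should no longer react to Kafka messages, the workflow ends by deleting the trigger. A sketch continuing the example above, wrapped in a function with an injectable `fetchImpl` (both are our additions) so it can be tested without a live server:

```javascript
// 4. Stop the event trigger (same endpoint, DELETE method).
async function stopEventTrigger(pipelineId, token, fetchImpl = fetch) {
  const res = await fetchImpl(`/api/v1/events/${pipelineId}`, {
    method: 'DELETE',
    headers: { 'Authorization': `Bearer ${token}` }
  });
  return res.json(); // e.g. { message: "Event deleted successfully" }
}
```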
Scaling
Knative automatically scales based on traffic:
- Scale to Zero: instances scale down to zero when no events arrive
- Auto-scaling: the instance count scales automatically with event processing volume