Events API

API for event-based pipeline execution.

Overview

The Events API lets you execute pipelines triggered by messages from Kafka topics. Serverless event processing is built on Knative Eventing.

Prerequisites

  • The pipeline must have a Kafka topic specified in the event option.
  • Knative Eventing and the Kafka source (KafkaSource) must be installed in the cluster.

Endpoints

POST /api/v1/events/{pipeline_id}

Starts the event trigger for a pipeline.

Path Parameters

Parameter     Type     Required   Description
pipeline_id   string   Yes        Pipeline ID

Query Parameters

Parameter          Type     Required   Description
pipeline_version   string   No         Use a specific pipeline version

Response

200 OK

{
  "message": "Event started successfully"
}

Behavior Description

  1. Deletes the existing Knative service and Kafka source, if present.
  2. Creates a new Knative service.
  3. Creates a Kafka source to subscribe to the specified topic.
  4. When a message arrives at the topic, the pipeline is executed.

Error Responses

Status Code                 Description
500 Internal Server Error   Pipeline does not have an event option configured
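A minimal sketch of starting the trigger from JavaScript and surfacing the 500 error above as a thrown error. The base URL and the helper name are assumptions for illustration, not part of the API; `fetchFn` is injectable only so the helper can be exercised without a live server.

```javascript
// Hypothetical helper: POST /api/v1/events/{pipeline_id} and raise a
// descriptive error when the server rejects the request (for example,
// a 500 when the pipeline lacks the "event" option).
async function startEventTrigger(pipelineId, token, {
  baseUrl = 'https://api.dhub.io', // assumed host, matching the cURL examples
  fetchFn = fetch,                  // injectable for testing
} = {}) {
  const res = await fetchFn(`${baseUrl}/api/v1/events/${pipelineId}`, {
    method: 'POST',
    headers: { 'Authorization': `Bearer ${token}` },
  });
  if (!res.ok) {
    // A 500 here typically means the pipeline has no event option configured.
    throw new Error(`Failed to start event trigger (HTTP ${res.status})`);
  }
  return res.json(); // { message: "Event started successfully" }
}
```

Because the trigger recreates the Knative service and source, calling this on an already-running trigger restarts it rather than failing.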

GET /api/v1/events/{pipeline_id}

Retrieves the status of an event trigger.

Path Parameters

Parameter     Type     Required   Description
pipeline_id   string   Yes        Pipeline ID

Response

200 OK

{
  "ready": true
}

Field   Type      Description
ready   boolean   Knative service ready status
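Since the Knative service may take a moment to become ready after the trigger is started, a client can poll this endpoint. The helper below is a sketch under the same assumed base URL; `fetchFn` and `sleepFn` are parameters only so the loop is testable.

```javascript
// Hypothetical helper: poll GET /api/v1/events/{pipeline_id} until the
// Knative service reports ready, bounded by a fixed number of attempts.
async function waitForEventReady(pipelineId, token, {
  baseUrl = 'https://api.dhub.io', // assumed host, matching the cURL examples
  fetchFn = fetch,                  // injectable for testing
  sleepFn = (ms) => new Promise((resolve) => setTimeout(resolve, ms)),
  attempts = 10,
  intervalMs = 2000,
} = {}) {
  for (let i = 0; i < attempts; i++) {
    const res = await fetchFn(`${baseUrl}/api/v1/events/${pipelineId}`, {
      headers: { 'Authorization': `Bearer ${token}` },
    });
    const { ready } = await res.json();
    if (ready) return true;    // Knative service is up and receiving events
    await sleepFn(intervalMs); // wait before the next check
  }
  return false;                // never became ready within the attempt budget
}
```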

DELETE /api/v1/events/{pipeline_id}

Stops and deletes the event trigger.

Path Parameters

Parameter     Type     Required   Description
pipeline_id   string   Yes        Pipeline ID

Response

200 OK

{
  "message": "Event deleted successfully"
}

Pipeline Configuration

To use event triggers, you must specify a Kafka topic in the pipeline options.

{
  "id": "pipeline-abc123",
  "name": "Event Pipeline",
  "options": {
    "event": "my-kafka-topic"
  },
  "steps": [...]
}

Architecture

┌─────────────┐     ┌─────────────────┐     ┌───────────────────┐
│ Kafka Topic │────>│ Knative Source  │────>│  Knative Service  │
│   (event)   │     │  (KafkaSource)  │     │ (Pipeline Runner) │
└─────────────┘     └─────────────────┘     └───────────────────┘
  1. Kafka Topic: Topic where event messages are published
  2. Knative Source: Subscribes to Kafka topic and delivers events
  3. Knative Service: Receives events and executes pipeline

Usage Examples

cURL

# Start event trigger
curl -X POST https://api.dhub.io/api/v1/events/pipeline-abc123 \
  -H "Authorization: Bearer <access_token>"

# Get event status
curl https://api.dhub.io/api/v1/events/pipeline-abc123 \
  -H "Authorization: Bearer <access_token>"

# Stop event trigger
curl -X DELETE https://api.dhub.io/api/v1/events/pipeline-abc123 \
  -H "Authorization: Bearer <access_token>"

Event-Based Pipeline Workflow

// 1. Create pipeline (including event option)
await fetch('/api/v1/pipelines/', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${token}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    name: 'Order Processing',
    options: {
      event: 'orders-topic'
    },
    steps: [...]
  })
});

// 2. Start event trigger
await fetch(`/api/v1/events/${pipelineId}`, {
  method: 'POST',
  headers: { 'Authorization': `Bearer ${token}` }
});

// 3. Check status
const status = await fetch(`/api/v1/events/${pipelineId}`, {
  headers: { 'Authorization': `Bearer ${token}` }
}).then(res => res.json());

console.log('Event ready:', status.ready);
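The workflow above leaves the trigger running. A matching teardown step, using the DELETE endpoint described earlier, can be sketched as follows; the helper name is an assumption, and `fetchFn` is a parameter only so the call can be exercised without a live server.

```javascript
// 4. Stop the event trigger once event processing is no longer needed.
// Deleting the trigger removes the Knative service and Kafka source,
// so no further messages on the topic will execute the pipeline.
async function stopEventTrigger(pipelineId, token, fetchFn = fetch) {
  const res = await fetchFn(`/api/v1/events/${pipelineId}`, {
    method: 'DELETE',
    headers: { 'Authorization': `Bearer ${token}` },
  });
  return res.json(); // { message: "Event deleted successfully" }
}
```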

Scaling

Knative automatically scales based on traffic:

  • Scale to Zero: Instances scale down to 0 when there are no events
  • Auto-scaling: Automatic scaling based on event processing volume