
Cancel Batch Job

client.beta.batch.cancel(jobID: string, params: BatchCancelParams, options?: RequestOptions): BatchCancelResponse
POST /api/v1/beta/batch-processing/{job_id}/cancel

Cancel a running batch processing job.

Stops processing and marks all pending items as cancelled. Items currently being processed may complete depending on their state.

Parameters
jobID: string
params: BatchCancelParams { organization_id, project_id, reason, temporalNamespace }
organization_id?: string | null

Query param

format: uuid
project_id?: string | null

Query param

format: uuid
reason?: string | null

Body param: Optional reason for cancelling the job

temporalNamespace?: string

Header param
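
All of the optional parameters are passed in a single params object as the second argument; how each field is routed (query, body, or header) follows the listing above. A minimal sketch — the job ID and every value shown are placeholders:

import LlamaCloud from '@llamaindex/llama-cloud';

const client = new LlamaCloud({ apiKey: process.env['LLAMA_CLOUD_API_KEY'] });

// Every BatchCancelParams field is optional; pass only what you need.
// All values below are placeholders.
const response = await client.beta.batch.cancel('job_id', {
  organization_id: 'org-uuid', // query param, format: uuid
  project_id: 'project-uuid', // query param, format: uuid
  reason: 'No longer needed', // body param: reason for cancelling the job
  temporalNamespace: 'my-namespace', // header param
});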

Returns
BatchCancelResponse { job_id, message, processed_items, status }

Response after cancelling a batch job.

job_id: string

ID of the cancelled job

message: string

Confirmation message

processed_items: number

Number of items processed before cancellation

status: "pending" | "running" | "dispatched" | "completed" | "failed" | "cancelled"

New status (should be 'cancelled')

Accepts one of the following:
"pending"
"running"
"dispatched"
"completed"
"failed"
"cancelled"

Cancel Batch Job

import LlamaCloud from '@llamaindex/llama-cloud';

const client = new LlamaCloud({
  apiKey: process.env['LLAMA_CLOUD_API_KEY'], // This is the default and can be omitted
});

// Cancel the job; any items still pending will be marked cancelled.
const response = await client.beta.batch.cancel('job_id');

console.log(response.job_id);
Returns Examples
{
  "job_id": "job_id",
  "message": "message",
  "processed_items": 0,
  "status": "pending"
}