
Data Sinks

List Data Sinks
client.dataSinks.list(DataSinkListParams { organization_id, project_id } query?, RequestOptions options?): DataSinkListResponse { id, component, name, 4 more }
GET /api/v1/data-sinks
Create Data Sink
client.dataSinks.create(DataSinkCreateParams { component, name, sink_type, 2 more } params, RequestOptions options?): DataSink { id, component, name, 4 more }
POST /api/v1/data-sinks
Get Data Sink
client.dataSinks.get(string dataSinkID, RequestOptions options?): DataSink { id, component, name, 4 more }
GET /api/v1/data-sinks/{data_sink_id}
Update Data Sink
client.dataSinks.update(string dataSinkID, DataSinkUpdateParams { sink_type, component, name } body, RequestOptions options?): DataSink { id, component, name, 4 more }
PUT /api/v1/data-sinks/{data_sink_id}
Delete Data Sink
client.dataSinks.delete(string dataSinkID, RequestOptions options?): void
DELETE /api/v1/data-sinks/{data_sink_id}
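Taken together, the five endpoints form a standard CRUD cycle. The sketch below is illustrative only: the `client` type is reduced to the three methods it uses, and the `makeCreateParams` helper and the `QDRANT` payload values are assumptions for the example, not part of this reference.

```typescript
// Minimal shape of the create payload, following DataSinkCreateParams.
type DataSinkCreateParams = {
  name: string;
  sink_type: string;
  component: Record<string, unknown>;
};

function makeCreateParams(
  name: string,
  sinkType: string,
  component: Record<string, unknown>,
): DataSinkCreateParams {
  return { name, sink_type: sinkType, component };
}

// A client implementing just the subset of dataSinks methods used here.
interface DataSinksClient {
  dataSinks: {
    create(params: DataSinkCreateParams): Promise<{ id: string }>;
    get(dataSinkID: string): Promise<{ id: string; name: string }>;
    delete(dataSinkID: string): Promise<void>;
  };
}

// Create a sink, read it back, then clean up (POST, GET, DELETE above).
async function crudRoundTrip(client: DataSinksClient) {
  const created = await client.dataSinks.create(
    makeCreateParams("docs-sink", "QDRANT", {
      collection_name: "docs",
      url: "https://qdrant.example.com",
      api_key: "<QDRANT_API_KEY>",
    }),
  );
  const fetched = await client.dataSinks.get(created.id);
  await client.dataSinks.delete(created.id);
  return fetched;
}
```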
Models
DataSink { id, component, name, 4 more }

Schema for a data sink.

id: string

Unique identifier

format: uuid
component: Record<string, unknown> | CloudPineconeVectorStore { api_key, index_name, class_name, 3 more } | CloudPostgresVectorStore { database, embed_dim, host, 10 more } | 5 more

Component that implements the data sink

Accepts one of the following:
Record<string, unknown>
CloudPineconeVectorStore { api_key, index_name, class_name, 3 more }

Cloud Pinecone Vector Store.

This class is used to store the configuration for a Pinecone vector store, so that it can be created and used in LlamaCloud.

Args:
- api_key (str): API key for authenticating with Pinecone
- index_name (str): name of the Pinecone index
- namespace (Optional[str]): namespace to use in the Pinecone index
- insert_kwargs (Optional[dict]): additional kwargs to pass during insertion

api_key: string

The API key for authenticating with Pinecone

format: password
index_name: string
class_name?: string
insert_kwargs?: Record<string, unknown> | null
namespace?: string | null
supports_nested_metadata_filters?: true
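As a concrete sketch of the component shape, the interface and helper below mirror the CloudPineconeVectorStore fields listed above; the names `PineconeSinkComponent` and `pineconeComponent` are ours for illustration, not exported by any SDK.

```typescript
// Field names follow the CloudPineconeVectorStore schema above.
interface PineconeSinkComponent {
  api_key: string;
  index_name: string;
  namespace?: string | null;
  insert_kwargs?: Record<string, unknown> | null;
}

function pineconeComponent(
  apiKey: string,
  indexName: string,
  namespace?: string,
): PineconeSinkComponent {
  const component: PineconeSinkComponent = {
    api_key: apiKey,
    index_name: indexName,
  };
  // namespace is optional; omit the key entirely when not supplied.
  if (namespace !== undefined) component.namespace = namespace;
  return component;
}
```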
CloudPostgresVectorStore { database, embed_dim, host, 10 more }
database: string
embed_dim: number
host: string
password: string
port: number
schema_name: string
table_name: string
user: string
class_name?: string
hnsw_settings?: PgVectorHnswSettings { distance_method, ef_construction, ef_search, 2 more } | null

HNSW settings for PGVector.

distance_method?: "l2" | "ip" | "cosine" | 3 more

The distance method to use.

Accepts one of the following:
"l2"
"ip"
"cosine"
"l1"
"hamming"
"jaccard"
ef_construction?: number

The number of edges to use during the construction phase.

minimum: 1

ef_search?: number

The number of edges to use during the search phase.

minimum: 1
m?: number

The number of bi-directional links created for each new element.

minimum: 1
vector_type?: "vector" | "half_vec" | "bit" | "sparse_vec"

The type of vector to use.

Accepts one of the following:
"vector"
"half_vec"
"bit"
"sparse_vec"
perform_setup?: boolean
supports_nested_metadata_filters?: boolean
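The CloudPostgresVectorStore shape, including optional HNSW tuning, can be sketched as below. The defaults chosen here (localhost, port 5432, schema `public`, m = 16, ef_construction = 64) are illustrative assumptions, not values from this reference.

```typescript
// Mirrors PgVectorHnswSettings above; all fields are optional.
interface PgVectorHnswSettings {
  distance_method?: "l2" | "ip" | "cosine" | "l1" | "hamming" | "jaccard";
  ef_construction?: number; // construction phase, minimum 1
  ef_search?: number;       // search phase, minimum 1
  m?: number;               // bi-directional links per new element, minimum 1
  vector_type?: "vector" | "half_vec" | "bit" | "sparse_vec";
}

// Mirrors the required CloudPostgresVectorStore fields above.
interface PostgresSinkComponent {
  database: string;
  embed_dim: number;
  host: string;
  password: string;
  port: number;
  schema_name: string;
  table_name: string;
  user: string;
  hnsw_settings?: PgVectorHnswSettings | null;
}

function postgresComponent(
  database: string,
  password: string,
  embedDim: number,
  overrides: Partial<PostgresSinkComponent> = {},
): PostgresSinkComponent {
  return {
    database,
    password,
    embed_dim: embedDim,
    host: "localhost",
    port: 5432,
    schema_name: "public",
    table_name: "embeddings",
    user: "postgres",
    hnsw_settings: { distance_method: "cosine", m: 16, ef_construction: 64 },
    ...overrides, // caller-supplied fields win over the defaults
  };
}
```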
CloudQdrantVectorStore { api_key, collection_name, url, 4 more }

Cloud Qdrant Vector Store.

This class is used to store the configuration for a Qdrant vector store, so that it can be created and used in LlamaCloud.

Args:
- collection_name (str): name of the Qdrant collection
- url (str): url of the Qdrant instance
- api_key (str): API key for authenticating with Qdrant
- max_retries (int): maximum number of retries in case of a failure. Defaults to 3
- client_kwargs (dict): additional kwargs to pass to the Qdrant client

api_key: string
collection_name: string
url: string
class_name?: string
client_kwargs?: Record<string, unknown>
max_retries?: number
supports_nested_metadata_filters?: true
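A minimal builder for the CloudQdrantVectorStore shape might look like this; `max_retries` defaults to 3 as stated in the docstring, while the URL check and the function name are our own additions.

```typescript
// Mirrors the CloudQdrantVectorStore fields above.
interface QdrantSinkComponent {
  api_key: string;
  collection_name: string;
  url: string;
  max_retries?: number;
  client_kwargs?: Record<string, unknown>;
}

function qdrantComponent(
  url: string,
  collectionName: string,
  apiKey: string,
  maxRetries = 3, // docstring default
): QdrantSinkComponent {
  // Cheap pre-flight check before sending the payload to the API.
  if (!/^https?:\/\//.test(url)) {
    throw new Error(`Qdrant url must be http(s): ${url}`);
  }
  return {
    url,
    collection_name: collectionName,
    api_key: apiKey,
    max_retries: maxRetries,
  };
}
```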
CloudAzureAISearchVectorStore { search_service_api_key, search_service_endpoint, class_name, 8 more }

Cloud Azure AI Search Vector Store.

search_service_api_key: string
search_service_endpoint: string
class_name?: string
client_id?: string | null
client_secret?: string | null
embedding_dimension?: number | null
filterable_metadata_field_keys?: Record<string, unknown> | null
index_name?: string | null
search_service_api_version?: string | null
supports_nested_metadata_filters?: true
tenant_id?: string | null

CloudMongoDBAtlasVectorStore { mongodb_uri, db_name, collection_name, 2 more }

Cloud MongoDB Atlas Vector Store.

This class is used to store the configuration for a MongoDB Atlas vector store, so that it can be created and used in LlamaCloud.

Args:
- mongodb_uri (str): URI for connecting to MongoDB Atlas
- db_name (str): name of the MongoDB database
- collection_name (str): name of the MongoDB collection
- vector_index_name (str): name of the MongoDB Atlas vector index
- fulltext_index_name (str): name of the MongoDB Atlas full-text index

CloudMilvusVectorStore { uri, token, class_name, 3 more }

Cloud Milvus Vector Store.

uri: string
token?: string | null
class_name?: string
collection_name?: string | null
embedding_dimension?: number | null
supports_nested_metadata_filters?: boolean
CloudAstraDBVectorStore { token, api_endpoint, collection_name, 4 more }

Cloud AstraDB Vector Store.

This class is used to store the configuration for an AstraDB vector store, so that it can be created and used in LlamaCloud.

Args:
- token (str): The Astra DB Application Token to use.
- api_endpoint (str): The Astra DB JSON API endpoint for your database.
- collection_name (str): Collection name to use. If not existing, it will be created.
- embedding_dimension (int): Length of the embedding vectors in use.
- keyspace (Optional[str]): The keyspace to use. If not provided, 'default_keyspace'

token: string

The Astra DB Application Token to use

format: password
api_endpoint: string

The Astra DB JSON API endpoint for your database

collection_name: string

Collection name to use. If not existing, it will be created

embedding_dimension: number

Length of the embedding vectors in use

class_name?: string
keyspace?: string | null

The keyspace to use. If not provided, 'default_keyspace'

supports_nested_metadata_filters?: true
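The keyspace fallback described above can be made explicit in a builder. This is a sketch: the interface and helper names are ours, while the `'default_keyspace'` fallback comes from the schema.

```typescript
// Mirrors the CloudAstraDBVectorStore fields above.
interface AstraDbSinkComponent {
  token: string;
  api_endpoint: string;
  collection_name: string;
  embedding_dimension: number;
  keyspace?: string | null;
}

function astraComponent(
  token: string,
  apiEndpoint: string,
  collectionName: string,
  embeddingDimension: number,
  keyspace?: string,
): AstraDbSinkComponent {
  return {
    token,
    api_endpoint: apiEndpoint,
    collection_name: collectionName,
    embedding_dimension: embeddingDimension,
    // Per the schema: if not provided, 'default_keyspace' is used.
    keyspace: keyspace ?? "default_keyspace",
  };
}
```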
name: string

The name of the data sink.

project_id: string
sink_type: "PINECONE" | "POSTGRES" | "QDRANT" | 4 more
Accepts one of the following:
"PINECONE"
"POSTGRES"
"QDRANT"
"AZUREAI_SEARCH"
"MONGODB_ATLAS"
"MILVUS"
"ASTRA_DB"
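Each sink_type value pairs with one of the component models above. As a hypothetical pre-flight check before POST /api/v1/data-sinks, the map below lists each model's required fields as documented on this page (for MONGODB_ATLAS, the fields named in its docstring); the helper itself is not an SDK function.

```typescript
// Required component fields per sink_type, mirroring the models above.
const REQUIRED_COMPONENT_FIELDS: Record<string, string[]> = {
  PINECONE: ["api_key", "index_name"],
  POSTGRES: [
    "database", "embed_dim", "host", "password",
    "port", "schema_name", "table_name", "user",
  ],
  QDRANT: ["api_key", "collection_name", "url"],
  AZUREAI_SEARCH: ["search_service_api_key", "search_service_endpoint"],
  MONGODB_ATLAS: [
    "mongodb_uri", "db_name", "collection_name",
    "vector_index_name", "fulltext_index_name",
  ],
  MILVUS: ["uri"],
  ASTRA_DB: ["token", "api_endpoint", "collection_name", "embedding_dimension"],
};

// Return the required fields that are absent from a candidate component.
function missingComponentFields(
  sinkType: string,
  component: Record<string, unknown>,
): string[] {
  const required = REQUIRED_COMPONENT_FIELDS[sinkType] ?? [];
  return required.filter((field) => component[field] === undefined);
}
```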
created_at?: string | null

Creation datetime

format: date-time
updated_at?: string | null

Update datetime

format: date-time