Data Stores API
The Data Stores API provides endpoints for managing DuckDB databases, tables, queries, and dashboards. All endpoints require authentication via a JWT or an API key.
Base URL
https://api.vertesia.io/api/v1/data
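As a minimal sketch of calling an endpoint directly, the request below lists data stores over HTTPS. The Bearer authorization scheme and header names are assumptions based on standard JWT/API-key practice; this reference does not specify the exact scheme.

```typescript
// Sketch of an authenticated request to the Data Stores API.
// The Authorization header format is an assumption (standard Bearer usage).
const BASE_URL = "https://api.vertesia.io/api/v1/data";

function authHeaders(token: string): Record<string, string> {
  return {
    Authorization: `Bearer ${token}`,
    "Content-Type": "application/json",
  };
}

async function listDataStores(token: string): Promise<unknown> {
  const res = await fetch(BASE_URL, { headers: authHeaders(token) });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
```

For typed access, prefer the TypeScript client described at the end of this page.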
Store Operations
List Data Stores
GET /data
Returns all data stores in the current project.
Response:
[
{
"id": "store_abc123",
"name": "sales_analytics",
"description": "Sales data warehouse",
"status": "ACTIVE",
"schema_version": 5,
"table_count": 3,
"total_rows": 150000,
"storage_bytes": 25600000,
"created": "2024-01-15T10:00:00Z",
"updated": "2024-01-20T15:30:00Z"
}
]
Create Data Store
POST /data
Request Body:
{
"name": "sales_analytics",
"description": "Sales data warehouse",
"tags": ["analytics", "sales"]
}
Response: the created DataStore object, including its generated ID.
Get Data Store
GET /data/:storeId
Returns detailed information about a data store including full schema.
Delete Data Store
DELETE /data/:storeId
Archives a data store (soft delete).
Schema Operations
Get Schema
GET /data/:storeId/schema
Query Parameters:
format - 'full' for complete details, 'ai' for an AI-optimized summary
Response (format=ai):
{
"name": "sales_analytics",
"version": 5,
"tables": {
"customers": {
"description": "Customer master data",
"columns": {
"id": {"type": "INTEGER", "primary_key": true},
"email": {"type": "STRING", "semantic_type": "email"},
"name": {"type": "STRING"}
}
}
},
"relationships": [
{"from": "orders.customer_id", "to": "customers.id", "type": "many-to-one"}
]
}
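The relationships array in the schema response can drive JOIN generation. A small sketch (the helper and its name are ours, not part of the client library):

```typescript
// Build a SQL JOIN clause from a relationship entry in the schema response.
interface Relationship {
  from: string; // e.g. "orders.customer_id"
  to: string;   // e.g. "customers.id"
  type: string; // e.g. "many-to-one"
}

function joinClause(rel: Relationship): string {
  const toTable = rel.to.split(".")[0];
  return `JOIN ${toTable} ON ${rel.from} = ${rel.to}`;
}

// joinClause({ from: "orders.customer_id", to: "customers.id", type: "many-to-one" })
// → "JOIN customers ON orders.customer_id = customers.id"
```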
Update Schema
PUT /data/:storeId/schema
Applies a bulk schema update, such as a batch of AI-driven changes, in a single request.
Request Body:
{
"description": "Updated description",
"tables": [...],
"relationships": [...],
"drop_tables": ["old_table"],
"message": "Schema update for Q1 reporting"
}
Get Schema History
GET /data/:storeId/schema/history
Returns version history of schema changes.
Table Operations
List Tables
GET /data/:storeId/tables
Response:
[
{
"name": "customers",
"description": "Customer master data",
"column_count": 5,
"row_count": 10000
}
]
Create Tables
POST /data/:storeId/tables
Creates multiple tables atomically.
Request Body:
{
"tables": [
{
"name": "customers",
"description": "Customer information",
"columns": [
{"name": "id", "type": "INTEGER", "primary_key": true},
{"name": "email", "type": "STRING", "semantic_type": "email", "unique": true},
{"name": "name", "type": "STRING", "nullable": false}
]
}
],
"message": "Create customer table"
}
Column Properties:
name (required) - Column name
type (required) - Data type: STRING, INTEGER, BIGINT, FLOAT, DOUBLE, DECIMAL, BOOLEAN, DATE, TIMESTAMP, JSON
description - Column description
nullable - Allow NULL values (default: true)
primary_key - Is primary key
unique - Enforce uniqueness
default - Default value expression
semantic_type - AI hint: email, phone, url, currency, percentage, person_name, address, country, date_iso, identifier
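The column properties above map naturally onto a TypeScript shape. The interface below is a hypothetical illustration (the names ColumnType and ColumnDef are ours, not exported by the client library):

```typescript
// Hypothetical typed shape mirroring the column properties listed above.
type ColumnType =
  | "STRING" | "INTEGER" | "BIGINT" | "FLOAT" | "DOUBLE"
  | "DECIMAL" | "BOOLEAN" | "DATE" | "TIMESTAMP" | "JSON";

interface ColumnDef {
  name: string;
  type: ColumnType;
  description?: string;
  nullable?: boolean;      // default: true
  primary_key?: boolean;
  unique?: boolean;
  default?: string;        // default value expression
  semantic_type?: string;  // AI hint, e.g. "email", "currency"
}

// A unique email column, matching the create-tables example above:
const emailCol: ColumnDef = {
  name: "email",
  type: "STRING",
  semantic_type: "email",
  unique: true,
};
```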
Get Table
GET /data/:storeId/tables/:tableName
Query Parameters:
sample - 'true' to include sample rows
Alter Table
PUT /data/:storeId/tables/:tableName
Request Body:
{
"add_columns": [
{"name": "new_col", "type": "STRING"}
],
"drop_columns": ["old_col"],
"rename_columns": [
{"from": "col1", "to": "column_one"}
],
"modify_columns": [
{"name": "amount", "type": "DECIMAL"}
]
}
Drop Table
DELETE /data/:storeId/tables/:tableName
Import Operations
Import Data
POST /data/:storeId/import
Imports data atomically into one or more tables.
Request Body:
{
"tables": {
"customers": {
"source": "inline",
"data": [
{"id": 1, "name": "Alice", "email": "alice@example.com"}
]
},
"orders": {
"source": "url",
"uri": "https://example.com/orders.csv",
"format": "csv"
}
},
"mode": "append",
"message": "Monthly data import"
}
Source Types:
inline - Data in the data field
gcs - Google Cloud Storage (gs://bucket/path)
url - HTTPS URL
artifact - Sandbox artifact (out/filename.csv)
Modes:
append - Add to existing data (default)
replace - Clear the table first
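For instance, a full reload from Cloud Storage combines the gcs source with replace mode. A sketch of such a request body (the bucket path is a placeholder, not a real location):

```typescript
// Illustrative import request body: reload "orders" from a GCS object,
// clearing existing rows first. The gs:// path is a placeholder.
const importRequest = {
  tables: {
    orders: {
      source: "gcs",
      uri: "gs://my-bucket/exports/orders.csv",
      format: "csv",
    },
  },
  mode: "replace",
  message: "Full reload of orders",
};
```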
Response:
{
"id": "import_xyz",
"status": "COMPLETED",
"rows_imported": {"customers": 100, "orders": 500},
"version_id": "v_123"
}
Get Import Status
GET /data/:storeId/import/:importId
Query Operations
Execute Query
POST /data/:storeId/query
Request Body:
{
"sql": "SELECT * FROM customers WHERE created_at > {{start_date}}",
"params": {"start_date": "2024-01-01"},
"limit": 100
}
Response:
{
"columns": ["id", "name", "email"],
"rows": [
[1, "Alice", "alice@example.com"],
[2, "Bob", "bob@example.com"]
],
"row_count": 2,
"execution_time_ms": 45
}
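Note that the response returns columns and row arrays separately rather than keyed objects. A small helper (ours, not part of the client library) to zip them together:

```typescript
// Zip a query response's columns and row arrays into keyed objects.
function rowsToObjects(
  columns: string[],
  rows: unknown[][],
): Record<string, unknown>[] {
  return rows.map((row) => {
    const obj: Record<string, unknown> = {};
    columns.forEach((col, i) => {
      obj[col] = row[i];
    });
    return obj;
  });
}

// With the response above:
// rowsToObjects(["id", "name", "email"], [[1, "Alice", "alice@example.com"]])
// → [{ id: 1, name: "Alice", email: "alice@example.com" }]
```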
Validate Query
POST /data/:storeId/query/validate
Validates SQL syntax and schema references without executing.
Request Body:
{
"queries": [
"SELECT * FROM customers",
"SELECT * FROM nonexistent_table"
]
}
Response:
{
"valid": false,
"errors": [
{"query": "SELECT * FROM nonexistent_table", "error": "Table 'nonexistent_table' not found"}
]
}
Batch Query
POST /data/:storeId/query/batch
Executes up to 20 queries in a single request.
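If you have more than 20 queries, split them into compliant batches before submitting. A minimal sketch of such a splitter (the helper is ours, not part of the client library):

```typescript
// Split a query list into batches that respect the 20-query cap.
function chunkQueries<T>(queries: T[], maxPerBatch = 20): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < queries.length; i += maxPerBatch) {
    batches.push(queries.slice(i, i + maxPerBatch));
  }
  return batches;
}

// 45 queries → 3 batches of sizes 20, 20, 5.
```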
Version Operations
List Versions
GET /data/:storeId/versions
Query Parameters:
snapshots_only - Only return named snapshots
limit - Maximum number of versions to return (default: 50)
Response:
[
{
"id": "v_abc123",
"message": "Schema update",
"schema_version": 5,
"is_snapshot": true,
"snapshot_name": "pre-migration",
"created_at": "2024-01-20T10:00:00Z"
}
]
Create Snapshot
POST /data/:storeId/versions
Request Body:
{
"name": "pre-migration-backup",
"message": "Snapshot before schema migration"
}
Rollback to Version
POST /data/:storeId/versions/:versionId/rollback
Restores the data store to a previous version.
Query Specific Version
POST /data/:storeId/versions/:versionId/query
Queries data from a specific version without rolling back.
Dashboard Operations
List Dashboards
GET /data/:storeId/dashboards
Query Parameters:
status - Filter by 'active' or 'archived'
Create Dashboard
POST /data/:storeId/dashboards
Request Body:
{
"name": "Sales Overview",
"description": "Monthly sales metrics",
"query": "SELECT month, SUM(amount) as revenue FROM orders GROUP BY month",
"spec": {
"mark": "bar",
"encoding": {
"x": {"field": "month", "type": "temporal"},
"y": {"field": "revenue", "type": "quantitative"}
}
},
"queryLimit": 10000,
"queryParameters": {"year": "2024"}
}
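Judging by the example above, the spec field follows Vega-Lite-style mark/encoding conventions; that is an inference, not something this reference states. Assuming it holds, the same revenue query could be rendered as a line chart:

```typescript
// Assumed Vega-Lite-style spec: the revenue-by-month query as a line chart.
const lineChartSpec = {
  mark: "line",
  encoding: {
    x: { field: "month", type: "temporal" },
    y: { field: "revenue", type: "quantitative" },
  },
};
```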
Get Dashboard
GET /data/:storeId/dashboards/:dashboardId
Update Dashboard
PUT /data/:storeId/dashboards/:dashboardId
Delete Dashboard
DELETE /data/:storeId/dashboards/:dashboardId
Archives the dashboard.
List Dashboard Versions
GET /data/:storeId/dashboards/:dashboardId/versions
Create Dashboard Snapshot
POST /data/:storeId/dashboards/:dashboardId/versions
Request Body:
{
"name": "q1-final",
"message": "Finalized Q1 dashboard"
}
Promote Dashboard Version
POST /data/:storeId/dashboards/:dashboardId/versions/:versionId/promote
Restores a specific version as the current dashboard.
Set Dashboard Versioning
PUT /data/:storeId/dashboards/:dashboardId/versioning
Request Body:
{
"enabled": true
}
Download Operations
Get Download URL
GET /data/:storeId/download
Query Parameters:
version_id - Optional specific version
Response:
{
"url": "https://storage.googleapis.com/...",
"expires_in": 900,
"store_id": "store_abc",
"store_name": "sales_analytics",
"schema_version": 5
}
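expires_in is in seconds, so the signed URL in the response stops working shortly after issuance. A sketch (the helper is ours) of converting it to an absolute deadline so callers know when to request a fresh URL:

```typescript
// Convert a relative expires_in (seconds) into an absolute deadline.
function urlDeadline(expiresInSeconds: number, now: Date = new Date()): Date {
  return new Date(now.getTime() + expiresInSeconds * 1000);
}

// With expires_in: 900, the URL is valid for 15 minutes from "now".
```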
TypeScript Client
The Vertesia client provides typed methods for all operations:
import { VertesiaClient } from '@vertesia/client';
const client = new VertesiaClient({ apiKey: 'your-api-key' });
const dataApi = client.data;
// Store operations
const stores = await dataApi.list();
const store = await dataApi.create({ name: 'my_store' });
const schema = await dataApi.getSchema(store.id, 'ai');
// Table operations
const tables = await dataApi.listTables(store.id);
await dataApi.createTables(store.id, { tables: [...] });
// Query operations
const result = await dataApi.query(store.id, { sql: 'SELECT * FROM users' });
// Dashboard operations
const dashboardApi = dataApi.dashboards(store.id);
const dashboards = await dashboardApi.list();
const dashboard = await dashboardApi.create({ name: 'Overview', query: '...', spec: {...} });
Error Codes
| Code | Description |
|---|---|
| 400 | Invalid request (bad SQL, schema validation failed) |
| 404 | Store, table, or dashboard not found |
| 409 | Conflict (duplicate name, constraint violation) |
| 422 | Unprocessable (import failed, type conversion error) |
| 500 | Internal error |
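A hedged sketch of handling these codes on the client side: the 4xx codes indicate caller errors (bad SQL, missing resources, conflicts), so only 5xx responses are treated as retryable here. The retry policy is our illustration, not guidance from this reference.

```typescript
// Map the documented status codes to messages and a simple retry decision.
function isRetryable(status: number): boolean {
  return status >= 500; // 4xx errors require fixing the request, not retrying
}

function describeError(status: number): string {
  switch (status) {
    case 400: return "Invalid request (bad SQL, schema validation failed)";
    case 404: return "Store, table, or dashboard not found";
    case 409: return "Conflict (duplicate name, constraint violation)";
    case 422: return "Unprocessable (import failed, type conversion error)";
    default:  return status >= 500 ? "Internal error" : `HTTP ${status}`;
  }
}
```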
