Chaprola — The Agent-First Data Platform with MCP Integration

Chaprola is the agent-first data platform built for AI agents and the Model Context Protocol (MCP). A structured, serverless data store operated through plain HTTP calls — no drivers, no ORMs, no infrastructure. One command wires any MCP-enabled agent into every Chaprola endpoint.

An agent defines a schema by sending sample JSON, writes a short program in a 19-command language, compiles it to bytecode, and executes it, all through REST endpoints. The VM is proven on 1,050,000 records, enforces a 100M-instruction budget per execution, and delivers O(1) field access via a fixed-record memory model. Every registered user also gets a @chaprola.org email address.

Model Context Protocol (MCP) integration

Chaprola is published on the Model Context Protocol (MCP) registry. One command installs the Chaprola MCP server into any MCP-enabled client:

claude mcp add chaprola -- npx @anthropic-ai/chaprola-mcp@latest

After that command runs, your agent has direct access to every Chaprola endpoint through its native MCP context: import JSON, compile programs, run reports, query data, send email from an @chaprola.org address, and manage files. No glue code, no SDK, no wrapper. The agent speaks MCP; Chaprola answers in MCP.
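For clients that are configured by file rather than by the `claude mcp add` command, an equivalent manual entry might look like the following. This is a sketch using the common `mcpServers` config shape (as used by Claude Desktop-style clients); only the package name comes from the command above, and the `-y` flag is an assumption to let npx run without prompting.

```json
{
  "mcpServers": {
    "chaprola": {
      "command": "npx",
      "args": ["-y", "@anthropic-ai/chaprola-mcp@latest"]
    }
  }
}
```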

Chaprola is designed for MCP from the ground up — it is not a database retrofitted for AI agents. The 19-command language is small enough for any LLM to learn in a single context window, so the agent can author and run its own programs without external tooling.

Scale

Chaprola is not a toy. The VM processes 1,050,000 NYC taxi trip records in under 3 seconds. Each Lambda runs with 10 GB memory — a 44-character record supports ~103 million rows, a 500-character record supports ~9 million rows. The fixed-record memory model delivers constant-time field access regardless of dataset size — no query planning, no indexing overhead. The 19-command language is small by design: small enough for any LLM to learn in one context window, powerful enough to sort, filter, aggregate, and report across millions of rows.
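The constant-time claim falls out of the fixed-record layout: a field's byte position is one multiply and one add, regardless of row count. A minimal sketch of the idea (the field names, widths, and 44-byte layout here are illustrative, not Chaprola's actual format):

```python
# Constant-time field access in a fixed-record layout (illustrative layout,
# not Chaprola's actual on-disk format).
RECORD_SIZE = 44  # bytes per record (example)
FIELDS = {"pickup": (0, 10), "fare": (10, 8), "distance": (18, 6)}  # name -> (offset, width)

def field_bytes(data: bytes, row: int, name: str) -> bytes:
    off, width = FIELDS[name]
    start = row * RECORD_SIZE + off  # one multiply + one add: O(1) in dataset size
    return data[start:start + width]

# Demo: two 44-byte records packed back to back.
rec0 = b"2026-01-01" + b"   12.50" + b"  3.1 " + b" " * 20
rec1 = b"2026-01-02" + b"    7.25" + b"  1.4 " + b" " * 20
data = rec0 + rec1
print(field_bytes(data, 1, "fare").decode().strip())  # 7.25
```

Because the address arithmetic is the same for row 2 as for row 100 million, no index or query plan is needed to reach a field.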

Quick Start

API Base URL: https://api.chaprola.org

  1. Register: POST /register with {"username":"my-agent","passcode":"my-secret","never_expires":true} — returns a JWT token
  2. Import data: POST /import with JSON array — creates .F (format) and .DA (data) files
  3. Compile: POST /compile with Chaprola source code — produces .PR bytecode
  4. Run: POST /run — executes the bytecode program
  5. Publish: POST /publish — makes the program publicly accessible
  6. Report: POST /report — run a published program (no auth required)
  7. Send email: POST /email/send — send from your @chaprola.org address
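The first few steps above can be sketched with nothing but the standard library. The endpoint paths and request bodies come from the steps listed; the `Authorization: Bearer` header and the `/run` body shape are assumptions (the standard JWT convention and an illustrative payload), since the exact wire details are not spelled out here. Requests are built but not sent.

```python
import json
import urllib.request

BASE = "https://api.chaprola.org"

def build_request(path, body, token=None):
    """Build (but do not send) a POST request for a Chaprola endpoint."""
    headers = {"Content-Type": "application/json"}
    if token:
        headers["Authorization"] = f"Bearer {token}"  # assumed JWT header shape
    return urllib.request.Request(
        BASE + path, data=json.dumps(body).encode(), headers=headers, method="POST"
    )

# Step 1: register (body taken verbatim from the Quick Start).
register = build_request(
    "/register", {"username": "my-agent", "passcode": "my-secret", "never_expires": True}
)
# Step 2: import a JSON array (sample rows are illustrative).
imp = build_request("/import", [{"fare": 12.5, "distance": 3.1}], token="<jwt>")
# Step 4: run a compiled program (body shape is illustrative, not documented here).
run = build_request("/run", {"program": "report.PR"}, token="<jwt>")

# To actually send one: urllib.request.urlopen(register), then read the JWT
# from the response and pass it as `token` to the authenticated calls.
```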

All Endpoints (20 total)

Method  Path              Auth  Description
GET     /hello            No    Health check
POST    /register         No    Create account, get JWT token
POST    /login            No    Login, get new token
POST    /check-username   No    Check username availability
POST    /delete-account   No    Delete account + all data
POST    /import           Yes   Import JSON into Chaprola format
POST    /import-url       Yes   Get presigned S3 upload URL
POST    /import-process   Yes   Process staged S3 file
POST    /export           Yes   Export Chaprola data to JSON
POST    /list             Yes   List files by userid/project
POST    /compile          Yes   Compile .CS source to .PR bytecode
POST    /run              Yes   Execute .PR bytecode program
POST    /publish          Yes   Make program publicly accessible
POST    /unpublish        Yes   Remove public access
POST    /report           No    Run a published program
POST    /email/inbound    No    Inbound webhook (Resend)
POST    /email/inbox      Yes   List emails in mailbox
POST    /email/read       Yes   Read full email by ID
POST    /email/send       Yes   Send email from @chaprola.org
POST    /email/delete     Yes   Delete email by ID

Authentication

Register or login to get a JWT token, then include it on every authenticated request.

Tokens last 90 days by default, or set "never_expires": true at registration.
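The exact header names are not spelled out above; the sketch below assumes the standard JWT Bearer convention, which is the most common way a REST API consumes such a token.

```python
# Assumed auth header shape (standard JWT Bearer convention; the doc does not
# spell out the exact headers). Attach these to any endpoint marked Auth: Yes.
token = "<your-jwt>"
headers = {
    "Authorization": f"Bearer {token}",
    "Content-Type": "application/json",
}
print(headers["Authorization"])
```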

The Language

Chaprola is a fixed-record data processing language with 19 commands: OPEN, CLOSE, GET, PUT, MOVE, ADD, SUB, MPY, DIV, IF, GOTO, FIND, DEFINE, PRINT, SIZE, SORT, COUNT, INDEX, END. Programs compile to bytecode and execute in a VM with constant-time field access.

Links

Originally designed by John H. Letcher, Professor of Computer Science, University of Tulsa. Updated for 2026 by Charles Letcher.