Developer Reference

API Reference

Integrate Codiner's PAIKE engine directly into your own tools and workflows.

Authentication

Codiner uses standard Bearer tokens for cloud-based orchestration. Local-first requests skip authentication if executed on the same network node.

Example Header
Authorization: Bearer CODINER_API_KEY
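For illustration, the header can be attached programmatically. The sketch below uses only the Python standard library and builds the request without sending it; the API host (`api.codiner.dev`) is an assumption, not something this reference specifies.

```python
import urllib.request

API_KEY = "CODINER_API_KEY"           # substitute your real key
BASE_URL = "https://api.codiner.dev"  # assumed host, adjust to your deployment

# Build (but do not send) an authenticated request against a cloud endpoint.
req = urllib.request.Request(
    f"{BASE_URL}/v1/system/health",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
# urllib.request.urlopen(req) would actually send it.
```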

Rate Limiting

Cloud tiers are restricted based on your neural credit balance. Local inference has zero rate limits, governed only by your hardware capabilities.

  • Free Tier: 1,000 requests / mo
  • Local Node: Unlimited
  • Foundry Max: 50k+ requests / mo

Error Response

401 Unauthorized
{
  "error": "invalid_api_key",
  "message": "The provided credentials do not exist in the foundry.",
  "requestId": "fd_9120asnk1"
}
Workspace

Project Init

POST /v1/workspace/init

Creates a new directory structure based on the specified template and neural profile.

Request Body
{
  "name": "my-next-app",
  "template": "next-foundry-pro",
  "auth_provider": "clerk"
}

Parameters

name (string): The name of the resulting directory.
template (string): ID of the Codiner template to clone.
auth_provider (string): Authentication provider to pre-configure (e.g., 'clerk').
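A call to this endpoint might be assembled as follows. This is a minimal sketch using the Python standard library; the host is assumed, and the request is built but not sent.

```python
import json
import urllib.request

API_KEY = "CODINER_API_KEY"           # substitute your real key
BASE_URL = "https://api.codiner.dev"  # assumed host

payload = {
    "name": "my-next-app",
    "template": "next-foundry-pro",
    "auth_provider": "clerk",
}
req = urllib.request.Request(
    f"{BASE_URL}/v1/workspace/init",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send the request.
```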

Team Synch

GET /v1/workspace/synch/status

Retrieve the current neural alignment status of all active teammates on the project.

Parameters

projectId (id): Unique identifier for the project node.
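Since this is a GET endpoint, the identifier travels as a query parameter. A sketch with an assumed host and a hypothetical project ID:

```python
from urllib.parse import urlencode

BASE_URL = "https://api.codiner.dev"  # assumed host
project_id = "proj_123"               # hypothetical project node ID

# Encode the parameter into the query string.
url = f"{BASE_URL}/v1/workspace/synch/status?{urlencode({'projectId': project_id})}"
```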
Orchestration

Start Synch

POST /v1/orchestrate/synch

Triggers a deep neural scan of the target repository and populates the local vector store.

Request Body
{
  "path": "./src",
  "engine": "llama-3-local",
  "options": {
    "depth": "ast-full",
    "indexing": "semantic"
  }
}

Parameters

path (string): File path to the project root (the request body above uses a relative path, './src').
engine (enum): The AI engine model to use (e.g., 'llama-3-local').
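The request body above, including the nested options object, could be posted like so. Again a standard-library sketch with an assumed host, built but not sent.

```python
import json
import urllib.request

API_KEY = "CODINER_API_KEY"           # substitute your real key
BASE_URL = "https://api.codiner.dev"  # assumed host

payload = {
    "path": "./src",
    "engine": "llama-3-local",
    "options": {"depth": "ast-full", "indexing": "semantic"},
}
req = urllib.request.Request(
    f"{BASE_URL}/v1/orchestrate/synch",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    method="POST",
)
```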

Apply Patch

PUT /v1/orchestrate/patch

Atomically applies neural-suggested code changes across multiple files while maintaining AST sanity.

Request Body
{
  "patchId": "p_8812",
  "files": [
    "src/app/page.tsx",
    "src/components/Nav.tsx"
  ]
}

Parameters

patchId (string): The ID of the generated code improvement.
files (string[]): Paths of the files the patch will be applied to.
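Note the PUT verb here. A hedged standard-library sketch (assumed host, request built but not sent):

```python
import json
import urllib.request

API_KEY = "CODINER_API_KEY"           # substitute your real key
BASE_URL = "https://api.codiner.dev"  # assumed host

payload = {
    "patchId": "p_8812",
    "files": ["src/app/page.tsx", "src/components/Nav.tsx"],
}
req = urllib.request.Request(
    f"{BASE_URL}/v1/orchestrate/patch",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    method="PUT",
)
```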
Neural Engine

Inference

POST /v1/neural/inference

Direct access to the underlying LLM with pre-loaded project context and PAIKE filters.

Request Body
{
  "prompt": "Add a dark mode toggle to the Navbar component.",
  "context_window": 16384
}

Parameters

prompt (string): The natural-language instruction.
context_window (int): Token limit for this specific query.
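An inference call might look like the following; a standard-library sketch with an assumed host, constructing the request without dispatching it.

```python
import json
import urllib.request

API_KEY = "CODINER_API_KEY"           # substitute your real key
BASE_URL = "https://api.codiner.dev"  # assumed host

payload = {
    "prompt": "Add a dark mode toggle to the Navbar component.",
    "context_window": 16384,
}
req = urllib.request.Request(
    f"{BASE_URL}/v1/neural/inference",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    method="POST",
)
```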

Vector Ops

POST /v1/neural/vector/query

Perform semantic search within your project's neural index. Find logic, not keywords.

Request Body
{
  "embedding": "authentication middleware logic",
  "top_k": 5
}

Parameters

embedding (string): The semantic query string.
top_k (int): Number of relevant blocks to return.
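A semantic search request could be assembled as below (standard library only; the host is an assumption and nothing is actually sent).

```python
import json
import urllib.request

API_KEY = "CODINER_API_KEY"           # substitute your real key
BASE_URL = "https://api.codiner.dev"  # assumed host

payload = {"embedding": "authentication middleware logic", "top_k": 5}
req = urllib.request.Request(
    f"{BASE_URL}/v1/neural/vector/query",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    method="POST",
)
```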
Templates

List Templates

GET /v1/templates

Fetches all available AI-native project templates from the foundry gallery.

Parameters

category (string): Filter templates by category (e.g., 'web', 'mobile').
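Filtering by category is a matter of adding a query parameter, for example:

```python
from urllib.parse import urlencode

BASE_URL = "https://api.codiner.dev"  # assumed host

# Request only web templates from the gallery.
url = f"{BASE_URL}/v1/templates?{urlencode({'category': 'web'})}"
```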
Security

Neural Audit

POST /v1/security/audit

Runs a multi-agent diagnostic on your code to identify logic flaws and potential zero-day exploits.

Request Body
{
  "scope": "auth-layer",
  "strict_mode": true
}

Parameters

scope (string): Specific project path or logical layer to audit.
strict_mode (boolean): Enables strict audit mode.
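An audit could be triggered with a request like the following (standard-library sketch, assumed host, request built but not sent):

```python
import json
import urllib.request

API_KEY = "CODINER_API_KEY"           # substitute your real key
BASE_URL = "https://api.codiner.dev"  # assumed host

payload = {"scope": "auth-layer", "strict_mode": True}
req = urllib.request.Request(
    f"{BASE_URL}/v1/security/audit",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    method="POST",
)
```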

Vulnerability Scan

GET /v1/security/scan/report

Retrieve the latest security report generated by the PAIKE engine.

Parameters

format (string): Report format: 'json' | 'pdf' | 'html'.
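The format selector is passed as a query parameter, for example:

```python
from urllib.parse import urlencode

BASE_URL = "https://api.codiner.dev"  # assumed host

# Fetch the latest report as JSON.
url = f"{BASE_URL}/v1/security/scan/report?{urlencode({'format': 'json'})}"
```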
System

Health Check

GET /v1/system/health

Check the operational status of the PAIKE engine, local models, and vector stores.

Parameters

None.

Telemetry

GET /v1/system/telemetry

Retrieve real-time token consumption, latency metrics, and hardware utilization stats.

Parameters

realtime (boolean): Whether to return live streaming telemetry.
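Note that boolean flags are serialized as strings in a query string, e.g.:

```python
from urllib.parse import urlencode

BASE_URL = "https://api.codiner.dev"  # assumed host

# 'true' as a string, since query parameters have no boolean type.
url = f"{BASE_URL}/v1/system/telemetry?{urlencode({'realtime': 'true'})}"
```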
Integrations

Webhooks

Subscribe to project events and receive real-time notifications about deployment status, neural scan completions, and security vulnerabilities.

Available Events

  • deployment.success
  • neural.scan.complete
  • security.alert.high
  • team.synch.error
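As a sketch of how a receiver might route these deliveries: the event names come from the list above, but the payload shape (an `event` field alongside a `projectId`) is an assumption for illustration.

```python
import json

KNOWN_EVENTS = {
    "deployment.success",
    "neural.scan.complete",
    "security.alert.high",
    "team.synch.error",
}

def route_event(raw_body: str) -> str:
    """Parse a webhook delivery and return its event name, rejecting unknown events."""
    payload = json.loads(raw_body)
    event = payload.get("event")  # field name is an assumption
    if event not in KNOWN_EVENTS:
        raise ValueError(f"unexpected event: {event!r}")
    return event

# Hypothetical delivery body:
delivery = json.dumps({"event": "neural.scan.complete", "projectId": "proj_123"})
```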

SDKs

  • Node.js SDK: npm install @codiner/sdk
  • Python SDK: pip install codiner-sdk
Resources

Architecture

Codiner operates via a hybrid local-proxy architecture. The PAIKE engine runs on your local node, while the cloud gateway handles team synchronization and credit management.

Local Node <-> PAIKE Proxy <-> Cloud Sync

Support

Discord

Join 5k+ developers in our neural foundry.

GitHub

Report issues and contribute to SDKs.

Twitter

Follow for neural engine updates.

Email

Direct support for Enterprise nodes.

Security & Privacy First

All API calls to `/v1/*` endpoints are processed through our proprietary local proxy if running in CLI mode. Your source code artifacts are never transmitted during orchestration unless cloud-inference is explicitly requested.