Quickstart Guide
Tonk follows a simple two-mode workflow:
- Workers stream data in - Background services connect to external APIs or your file system and stream data into your local Tonk store
- Apps visualise data - Frontend applications provide interfaces to explore and interact with that data
This architecture separates data ingestion from visualisation, making your applications more maintainable and your data more reusable across different interfaces.
Installing Tonk
First, you'll need to install Tonk on your machine:
npm install -g @tonk/cli && tonk hello
This will install the Tonk CLI globally and run the hello command, which sets up the Tonk daemon for synchronising your data.
If you encounter issues at this stage, see the troubleshooting guide at the bottom of the page.
The Tonk Workflow
Mode 1: Create Workers (Data Ingestion)
Workers are background services that connect to the outside world and stream data into your Tonk store. Start by creating a worker:
tonk create # choose 'worker' when prompted
cd my-worker
Workers handle tasks like:
- Syncing data from Google Maps, Gmail, or other APIs
- Processing scheduled tasks
- Real-time data streaming
- API integrations
Example worker structure:
// src/index.ts - Your worker's main logic
app.post('/tonk', async (req, res) => {
// Connect to external API
const data = await fetchFromExternalAPI();
// Store in Tonk via keepsync
await writeDoc('my-collection/data', data);
res.json({ success: true });
});
Mode 2: Create Apps (Data Visualisation)
Once you have data flowing in via workers, create frontend apps to visualise and interact with that data:
tonk create # choose 'react' when prompted
cd my-app
Apps are React applications that:
- Connect to your Tonk stores
- Provide interfaces for your data
- Enable real-time collaboration
- Work offline-first
The CLI will scaffold a project with:
- React, TypeScript, and Tailwind CSS
- Keepsync for accessing your data stores
- Development tools
Development Workflow
Start Your Worker
First, get your worker running to begin data ingestion:
cd my-worker
pnpm dev
Register and start the worker:
tonk worker register
tonk worker start my-worker
Start Your App
Then start your frontend app and navigate to http://localhost:3000 in the browser:
cd my-app
pnpm dev
This will:
- Start a development server with hot reloading (so changes in the code are instantly reflected)
- Connect to your Tonk stores (where workers are streaming data)
Understanding the Data Flow
The magic happens through Tonk Stores - shared data containers that connect workers and apps:
External APIs → Workers → Tonk Stores → Apps → Users
with real-time sync flowing both ways between the stores and the apps.
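As a toy model of this pipeline (plain TypeScript with no Tonk APIs — every name here is illustrative, not the keepsync interface), a store can be pictured as a shared container that notifies subscribers on every write:

```typescript
// Toy in-memory "store" sketching the worker → store → app flow.
// This is NOT the keepsync API — just the pub/sub idea behind it.
type Listener<T> = (doc: T) => void;

class ToyStore {
  private docs = new Map<string, unknown>();
  private listeners = new Map<string, Listener<any>[]>();

  write<T>(docId: string, doc: T): void {
    this.docs.set(docId, doc);
    // Notify every subscribed "app" as soon as the "worker" writes
    (this.listeners.get(docId) ?? []).forEach(fn => fn(doc));
  }

  subscribe<T>(docId: string, fn: Listener<T>): () => void {
    const list = this.listeners.get(docId) ?? [];
    list.push(fn);
    this.listeners.set(docId, list);
    return () => list.splice(list.indexOf(fn), 1); // cleanup function
  }
}

// The "app" subscribes, then the "worker" streams data in.
const store = new ToyStore();
let latest: { places: string[] } | undefined;
store.subscribe<{ places: string[] }>('locations/favorites', doc => (latest = doc));
store.write('locations/favorites', { places: ['Cafe A', 'Park B'] });
console.log(latest); // { places: ['Cafe A', 'Park B'] }
```

In the real system, keepsync plays the role of `ToyStore` and also persists and syncs the document across devices.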
In Workers: Writing Data
Workers stream data into stores using keepsync:
import { writeDoc } from '@tonk/keepsync';
// Worker streams in location data
await writeDoc('locations/favorites', {
places: googleMapsData,
lastSync: Date.now()
});
In Apps: Sculpting Data
Apps connect to these stores, letting you perform sophisticated operations on your data.
import { create } from "zustand";
import { sync } from "@tonk/keepsync";
// App reads and displays the same data
const useLocationStore = create(
sync(
(set) => ({
locations: [],
// ... your app logic
}),
{ docId: "locations/favorites" }
)
);
Key Benefits
- Separation of concerns: Workers handle data, apps handle logic and rendering
- Real-time sync: Changes appear instantly across all connected apps
- Offline-first: Everything works without internet, syncs when reconnected
- No database complexity: No migrations, caching, or auth headaches
- Collaborative: Multiple users see updates in real-time
Deployment Options
Tonk provides several ways to deploy your workers and apps:
Local Deployment
Deploy in one command:
# Build, push, and start your app in one step
tonk push
Or for more control:
# Skip building (if already built)
tonk push --no-build
# Upload only (don't start automatically)
tonk push --no-start
# Then start manually later
tonk start <bundleName> --route /<route>
One-Touch Hosting (Experimental)
For quick deployment to the cloud:
tonk deploy
⚠️ Note: This is experimental and requires an access code. Contact Tonk support for access.
Docker & Production
For production deployments, Tonk includes Docker support and cloud deployment options.
Next Steps
- Learn Workers: Tonk Workers Guide - Create background services
- Learn Keepsync: Keepsync Guide - Master data synchronization
- Deploy to Production: Deployment Guide - Docker, cloud, and hosting options
- CLI Reference: Command Reference - Complete command documentation
Real-World Examples
Check out these complete examples in the repository:
- Google Maps Locations Worker - Syncs your saved places from Google Maps
- My World App - Visualizes location data on an interactive map
- File Browser App - Browse and manage files with real-time sync
Troubleshooting
Permission Denied Error During Installation
If you encounter a permission denied error when installing the Tonk CLI globally:
npm install -g @tonk/cli
# Error: EACCES: permission denied, mkdir '/usr/local/lib/node_modules'
This is a common npm issue on Unix systems. Here are several solutions:
Option 1: Fix npm permissions
sudo chown -R $(whoami) $(npm config get prefix)/{lib/node_modules,bin,share}
Option 2: Configure npm to use a different directory
mkdir ~/.npm-global
npm config set prefix '~/.npm-global'
echo 'export PATH="$HOME/.npm-global/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc
npm install -g @tonk/cli
Vibe Coding with Tonk
What is Vibe Coding?
Vibe coding is a novel approach to software development where a person uses natural language to describe tasks to an LLM which generates the code. The person is primarily responsible for guiding the LLM and imparting a vision. Vibe coding is about building momentum, following creative instincts, and evolving a project through rapid iteration and feedback loops.
Essential Tools
Graphical Software
- Cursor - fork of Visual Studio Code, with deep AI integration
- Windsurf - another VS Code fork, more streamlined and beginner-friendly
Terminal
- Claude Code - highly capable and intuitive CLI-based agentic coding tool
Models
- Claude 3.7 Sonnet or Claude 4 Sonnet recommended for most tasks
- Claude 4 Opus or OpenAI o3 for complicated reasoning tasks and planning
Core Principles
1. Begin with a plan
Using your favourite vibe coding IDE or chat interface, work out a detailed plan of what you want to achieve with your project. It's best to include features, views, technical details, constraints, and visual preferences. If you're unsure about any specifics, have the LLM go with its best judgement. You can always refine your project later.
Once you're happy with your plan, save it to a markdown file in your project, so the LLM can reference it later. Then, open your agentic editor to your Tonk project and prompt the LLM with the first part of your plan.
2. Retain control
Vibe coding at its best can fully abstract the code from you. It should never abstract intent. LLMs can only implement what you can articulate, so make sure you can reason about what you're asking of it. Vibe coding fails hard and fast when you lose understanding of what the LLM is doing at a high level.
Break tasks into small steps you know how to test. When something goes wrong, specifically articulate what you wanted, what you observed, and what you want to change. If there are errors in the browser, code, or terminal, make sure to pass them on to the LLM.
3. Git as fail-safe
Each time you complete a milestone, commit the changes to git. This allows you to quickly see what changes the LLM has made since the last commit, compare various stages of your project, and try multiple iterations of a feature without risking stable parts of your code.
4. Ask AI for help
If you're not sure what you want or how to get there, be as clear as you can to the LLM and ask for some options without writing any code. Weigh the options then ask it to go through with one. If you want to try multiple, commit your changes to git first. Then, you can change the code as much as you like and revert when necessary.
You can also ask the LLM to explain any part of the code to you. If you feel yourself losing grip of the project's intent, don't be afraid to dig into the code and poke around.
Where Tonk Comes In
Tonk is designed specifically to enable vibe coding workflows. Here's how Tonk supports the vibe coding approach:
Streamlined Tech Stack
Tonk's architecture, tooling, and developer experience are tailored for easy use by LLMs. The entire backend is abstracted, letting the AI focus on what it does best: React frontends. We use Vite, Tailwind, and Zustand, which are favoured by agentic tooling and provide the optimal balance between convenience and extensibility.
Tonk provides sync
middleware for Zustand stores so that all state saved to a store is automatically persisted and synced between connected devices around the world.
A backend server is automatically generated so you can query external services from your React app.
Exploratory Development
- Rapid prototyping - spin up new ideas in minutes, not hours
- Easy experimentation - try different approaches without fear of breaking things
- Continuous data - access any and all of your data across your apps, synced live
- Seamless scaling - move from prototype to production without architectural rewrites
What's Achievable with Tonk
Perfect For
- Real-time applications - chat apps, collaborative tools, live dashboards
- Exploratory projects - when you're not sure what you're building yet
- Idiosyncratic expression - for a unique and creative web
- Learning and experimentation - try new ideas quickly
- Small to medium applications - full-stack apps with 1-1000 users
- Hackathons and time-boxed projects - maximum velocity development
Coming Soon
- Enterprise applications - greater security guarantees and reliability
- User space - tying users to data
- Identity, authentication, permissions - gate access and share with peace of mind
Don't Use For
- Mission-critical systems
- Performance-critical applications
- Massive scale
Resources
For the practical: https://github.com/techiediaries/awesome-vibe-coding?tab=readme-ov-file.
For the poetic: https://www.thewayofcode.com/
Keepsync
Keepsync is our local-first sync engine that provides real-time collaboration and data management for Tonk applications. It uses Automerge CRDTs under the hood to enable automatic conflict resolution when multiple users edit the same data or work offline.
Documents and Stores
Keepsync supports two main ways to work with data:
- Synced Stores: Zustand stores enhanced with real-time synchronisation using the sync middleware
- Direct Document Access: File-system-like access to individual documents using path-based addressing
Documents are uniquely identified by a docId and automatically reconcile state between all clients connected to the same server.
Basic Usage
1. Set Up the Sync Engine
Initialise the sync engine in your application entry point (this is automatically included when you create a Tonk app):
// index.tsx
import { configureSyncEngine } from '@tonk/keepsync';
import { BrowserWebSocketClientAdapter } from '@automerge/automerge-repo-network-websocket';
import { IndexedDBStorageAdapter } from '@automerge/automerge-repo-storage-indexeddb';
const wsProtocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:';
const httpProtocol = window.location.protocol === 'https:' ? 'https://' : 'http://';
const wsUrl = `${wsProtocol}//${window.location.host}/sync`;
const wsAdapter = new BrowserWebSocketClientAdapter(wsUrl);
const storage = new IndexedDBStorageAdapter();
await configureSyncEngine({
url: `${httpProtocol}//${window.location.host}`,
network: [wsAdapter as any],
storage,
});
2. Create a Synced Store with the Middleware
Use the sync middleware to create stores that automatically synchronise with other clients:
// stores/counterStore.ts
import { create } from 'zustand';
import { sync } from '@tonk/keepsync';
interface CounterState {
count: number;
increment: () => void;
decrement: () => void;
reset: () => void;
}
export const useCounterStore = create<CounterState>(
sync(
// The store implementation
set => ({
count: 0,
// Increment the counter
increment: () => {
set(state => ({ count: state.count + 1 }));
},
// Decrement the counter
decrement: () => {
set(state => ({ count: Math.max(0, state.count - 1) }));
},
// Reset the counter
reset: () => {
set({ count: 0 });
},
}),
// Sync configuration
{
docId: 'counter',
// Optional: configure initialisation timeout (default: 30000ms)
initTimeout: 30000,
// Optional: handle initialisation errors
onInitError: error => console.error('Sync initialisation error:', error),
}
)
);
3. Use the Store in React Components
// components/Counter.tsx
import React from "react";
import { useCounterStore } from "../stores/counterStore";
export function Counter() {
// Use the store hook directly - sync is handled by the middleware
const { count, increment, decrement, reset } = useCounterStore();
return (
<div>
<h2>Collaborative Counter: {count}</h2>
<div>
<button onClick={decrement}>-</button>
<button onClick={increment}>+</button>
<button onClick={reset}>Reset</button>
</div>
<p>
<small>
Open this app in multiple windows to see real-time collaboration in
action.
</small>
</p>
</div>
);
}
Direct Document Access
For scenarios where you need more fine-grained control over document access, when working outside of React, or when a Zustand store is too heavyweight, you can work directly with documents using filesystem-like paths.
Reading and Writing Documents
import { readDoc, writeDoc } from '@tonk/keepsync';
// Read a document
const userData = await readDoc<{ name: string; email: string }>('users/john');
console.log(userData); // { name: "John Doe", email: "john@example.com" } or undefined
// Write a document
await writeDoc('users/john', {
name: 'John Doe',
email: 'john@example.com',
lastLogin: new Date().toISOString(),
});
// Update an existing document
const currentData = await readDoc('users/john');
if (currentData) {
await writeDoc('users/john', {
...currentData,
lastLogin: new Date().toISOString(),
});
}
Listening to Document Changes
You can listen for changes to specific documents without using the full sync middleware:
import { listenToDoc } from '@tonk/keepsync';
// Attach a listener to a document
const removeListener = await listenToDoc('users/john', doc => {
console.log('User document changed:', doc);
// Update UI or trigger other side effects
});
// Later, when you want to stop listening
removeListener();
File System Operations
Keepsync provides filesystem-like operations for organising your documents:
import { ls, mkDir, rm } from '@tonk/keepsync';
// List contents of a directory
const contents = await ls('users');
console.log(contents); // Returns DocNode with children array
// Create a directory structure
await mkDir('projects/tonk-app/data');
// Remove a document or directory (recursively)
const success = await rm('users/inactive-user');
console.log(success); // true if removed successfully
Advanced Features
Document Types and Structure
Keepsync organises documents in a hierarchical structure similar to a filesystem:
import type { DocNode, DirNode, RefNode } from '@tonk/keepsync';
// DocNode: Represents a document or directory
interface DocNode {
type: 'doc' | 'dir';
pointer?: DocumentId;
name: string;
timestamps: {
create: number;
modified: number;
};
children?: RefNode[];
}
// DirNode: Represents a directory
interface DirNode {
type: 'dir';
name: string;
timestamps: {
create: number;
modified: number;
};
children?: RefNode[];
}
// RefNode: Reference to a document or directory
interface RefNode {
pointer: DocumentId;
type: 'doc' | 'dir';
timestamps: {
create: number;
modified: number;
};
name: string;
}
Error Handling
import { readDoc, writeDoc } from '@tonk/keepsync';
try {
const data = await readDoc('some/path');
if (!data) {
console.log('Document not found');
}
} catch (error) {
console.error('Sync engine not initialised:', error);
}
// Handle sync initialisation errors in stores
const useMyStore = create(
sync(
set => ({
/* store definition */
}),
{
docId: 'my-store',
onInitError: error => {
// Handle initialisation failures
console.error('Failed to initialise sync:', error);
// Could show user notification, retry logic, etc.
},
}
)
);
Best Practices
- Use meaningful document paths: Organise your data logically using clear, hierarchical paths like users/profiles/john or projects/my-app/settings.
- Handle initialisation gracefully: Always provide onInitError callbacks for sync middleware to handle network or initialisation issues.
- Choose the right tool: Use synced stores for application state that needs real-time collaboration, and direct document access for more structured data or when you need filesystem-like operations.
- Clean up listeners: Always call the cleanup function returned by listenToDoc when components unmount or when listeners are no longer needed.
- Path conventions: Use forward slashes (/) as path separators and avoid starting paths with / (they will be normalised automatically).
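The path convention can be illustrated with a small helper. This is a sketch of what the normalisation behaviour looks like, not keepsync's actual implementation — keepsync performs its own normalisation for you:

```typescript
// Illustrative sketch: dropping leading and duplicate slashes,
// approximating the normalisation keepsync applies automatically.
function normalisePath(path: string): string {
  return path
    .split('/')
    .filter(segment => segment.length > 0) // removes '' from leading or doubled '/'
    .join('/');
}

console.log(normalisePath('/users/profiles/john'));      // 'users/profiles/john'
console.log(normalisePath('projects//my-app/settings')); // 'projects/my-app/settings'
```

In practice you can simply write paths without a leading slash and never need this helper yourself.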
API Reference
Sync Middleware
- sync<T>(config: StateCreator<T>, options: SyncOptions): StateCreator<T> - Creates a synced Zustand store
Document Operations
- readDoc<T>(path: string): Promise<T | undefined> - Read a document
- writeDoc<T>(path: string, content: T): Promise<void> - Write/update a document
- listenToDoc<T>(path: string, listener: (doc: T) => void): Promise<() => void> - Listen for document changes
Filesystem Operations
- ls(path: string): Promise<DocNode | undefined> - List directory contents
- mkDir(path: string): Promise<DirNode | undefined> - Create directory structure
- rm(path: string): Promise<boolean> - Remove a document or directory
Configuration
- configureSyncEngine(options: SyncEngineOptions): SyncEngine - Initialise the sync engine
- getSyncEngine(): SyncEngine | null - Get the current sync engine instance
Tonk Workers
Tonk Workers are background services that extend your Tonk applications with additional functionality. They run as separate processes and integrate seamlessly with the Tonk ecosystem through a standardized API.
Overview
Workers provide specialized functionality like:
- Data synchronisation with external services
- Scheduled background tasks
- API integrations
- Real-time data processing
Architecture
Workers are standalone Node.js applications that:
- Run on their own ports
- Communicate via HTTP/WebSocket
- Integrate with Tonk's sync system (keepsync)
- Are managed by the Tonk CLI
Creating a Worker
1. Initialize a Worker
tonk create # choose 'worker' when prompted
cd new-worker
2. Worker Structure
// src/index.ts
import express from 'express';
import cors from 'cors';
import { configureSyncEngine } from './sync.js';
const app = express();
const PORT = process.env.PORT || 5555;
app.use(cors());
app.use(express.json());
// Health check endpoint (required)
app.get('/health', (req, res) => {
res.json({ status: 'ok' });
});
// Main worker endpoint
app.post('/tonk', async (req, res) => {
try {
// Your worker logic here
const result = await processRequest(req.body);
res.json({ success: true, data: result });
} catch (error) {
res.status(500).json({ error: error.message });
}
});
app.listen(PORT, () => {
console.log(`Worker running on port ${PORT}`);
});
3. Worker Configuration
Create a worker.config.js file:
module.exports = {
runtime: {
port: 5555,
healthCheck: {
endpoint: '/health',
method: 'GET',
interval: 30000,
timeout: 5000,
},
},
process: {
file: 'index.js',
cwd: './dist',
instances: 1,
autorestart: true,
env: {
NODE_ENV: 'production',
},
},
schemas: {
documents: {
default: {},
},
},
};
Managing Workers
Register a Worker
tonk worker register
List Workers
tonk worker ls
Start/Stop Workers
tonk worker start my-worker
tonk worker stop my-worker
Check Worker Status
tonk worker ping my-worker
tonk worker inspect my-worker
View Logs
tonk worker logs my-worker
keepsync Integration
Workers can read and write to Tonk's sync system:
import { configureSyncEngine, readDoc, writeDoc } from './sync.js';
// Configure sync engine
const engine = await configureSyncEngine({
url: SYNC_URL,
network: [wsAdapter as any as NetworkAdapterInterface],
storage: new NodeFSStorageAdapter(),
});
// Read data
const data = await readDoc('my-collection/document-id');
// Write data
await writeDoc('my-collection/document-id', {
timestamp: Date.now(),
data: processedData,
});
Example: Google Maps Locations Worker
The codebase includes a complete example worker that:
- Connects to Google Maps API
- Exports saved locations daily
- Stores data in keepsync
- Provides CLI commands for setup
Key features:
- OAuth 2.0 authentication
- Scheduled exports (cron-like)
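The "scheduled exports (cron-like)" behaviour can be approximated without a cron library by computing the delay until the next daily run. This is a hedged sketch of one way to do it — the example worker in the repository may use a different scheduler, and `runExport` is a hypothetical function name:

```typescript
// Sketch: milliseconds until the next occurrence of hour:minute (UTC).
// A worker could pass this to setTimeout to trigger a daily export.
function msUntilNext(hour: number, minute: number, now: Date = new Date()): number {
  const next = new Date(Date.UTC(
    now.getUTCFullYear(), now.getUTCMonth(), now.getUTCDate(), hour, minute, 0, 0
  ));
  if (next.getTime() <= now.getTime()) {
    next.setUTCDate(next.getUTCDate() + 1); // already passed today: run tomorrow
  }
  return next.getTime() - now.getTime();
}

// e.g. schedule a (hypothetical) export function for 03:00 UTC:
// setTimeout(runExport, msUntilNext(3, 0));
```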
Standard Endpoints
Workers should implement these standard endpoints:
- GET /health - Health check (required)
- POST /tonk - Main processing endpoint
- GET /status - Worker status information
- Custom endpoints for specific functionality
Best Practices
Error Handling
// Global error handlers
process.on('uncaughtException', error => {
console.error('Uncaught Exception:', error);
process.exit(1);
});
process.on('unhandledRejection', reason => {
console.error('Unhandled Rejection:', reason);
process.exit(1);
});
// Graceful shutdown
process.on('SIGINT', () => {
console.log('Shutting down gracefully...');
process.exit(0);
});
Environment Configuration
// Use environment variables for configuration
const config = {
port: process.env.PORT || 5555,
syncUrl: process.env.SYNC_URL || 'ws://localhost:7777',
apiKey: process.env.API_KEY,
};
Logging
// Structured logging
console.log(
JSON.stringify({
timestamp: new Date().toISOString(),
level: 'info',
message: 'Worker started',
port: PORT,
})
);
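The logging pattern above can be wrapped in a small helper so every line gets the same shape. This is a sketch, not part of any Tonk API:

```typescript
// Minimal structured-logging helper mirroring the JSON shape above.
type Level = 'info' | 'warn' | 'error';

function logLine(level: Level, message: string, extra: Record<string, unknown> = {}): string {
  const entry = { timestamp: new Date().toISOString(), level, message, ...extra };
  const line = JSON.stringify(entry);
  console.log(line);
  return line; // returned so callers can inspect the exact output
}

logLine('info', 'Worker started', { port: 5555 });
```

One line of JSON per event keeps worker logs easy to grep and easy to ship to log aggregators later.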
Deployment
Workers are deployed separately from Tonk applications but work great together.
- Development: Workers run locally via tonk worker start
- Docker: Include workers in your docker-compose.yml
Troubleshooting
Worker Not Starting
# Check worker status
tonk worker inspect my-worker
# View logs
tonk worker logs my-worker
# Check port conflicts
lsof -i :5555
Health Check Failures
Ensure your worker responds to GET /health with:
{ "status": "ok" }
Sync Issues
- Verify the SYNC_URL environment variable
- Check network connectivity to the Tonk server
- Ensure proper keepsync configuration
Next Steps
- Learn about Keepsync for data synchronization
- Check out deployment strategies for production
Deploying Tonk Apps
This guide covers various deployment strategies for Tonk applications, from local containerization to cloud deployment on platforms like AWS EC2.
Overview
Tonk applications can be deployed in several ways:
- Local Development: Using the tonk -d daemon for development
- Docker Containerization: Packaging apps and the Tonk server in containers
- One-Touch Hosting: Using tonk deploy for managed hosting (experimental)
Docker Deployment
Tonk provides built-in Docker support for both the Tonk server and individual applications.
Tonk Server
The Tonk server is available as a pre-built Docker image, tonklabs/tonk-server:latest. Pull and run the image with:
docker run -d \
--name tonk-server \
-p 7777:7777 \
-v tonk-data:/data/tonk \
tonklabs/tonk-server:latest
Tonk Apps
When you create a Tonk app using tonk create, a docker-compose.yml file is automatically included in your project. This file is pre-configured to work with your app.
1. Build your Tonk app:
cd my-tonk-app
pnpm run build
2. Start the containers using the included configuration:
docker-compose up -d
3. Access your app:
- Tonk server: http://localhost:7777
- Your app: http://localhost:8000
Customizing Your Docker Setup
You can customize the included docker-compose.yml file for your specific needs:
services:
tonk-server:
image: tonklabs/tonk-server:latest
container_name: tonk-server
volumes:
- tonk-data:/data/tonk/stores
- tonk-bundles:/data/tonk/bundles
- ./dist:/tmp/app-bundle
ports:
- "7777:7777"
- "8000:8000"
environment:
- PORT=7777
- NODE_ENV=production
- VERBOSE=false # Disable verbose logging for production
- SYNC_INTERVAL=30000 # Set sync interval to 30 seconds
restart: unless-stopped
# The command section handles app deployment automatically
Environment Configuration
The Tonk server Docker image supports several environment variables:
| Variable | Default | Description |
|---|---|---|
| PORT | 7777 | Port for the Tonk server |
| BUNDLES_PATH | /data/tonk/bundles | Directory for storing app bundles |
| STORES_PATH | /data/tonk/stores | Directory for storing data |
| CONFIG_PATH | /data/tonk/config | Directory for configuration files |
| VERBOSE | true | Enable verbose logging |
| SYNC_INTERVAL | 0 | Sync interval in milliseconds |
| NODE_ENV | production | Node.js environment |
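Inside a worker or script these variables are read from process.env. A sketch of how you might mirror the documented defaults (the variable names come from the table above; the envOr helper itself is illustrative, not a Tonk API):

```typescript
// Read Tonk-server-style configuration from an environment map,
// falling back to the defaults documented in the table above.
function envOr(env: Record<string, string | undefined>, name: string, fallback: string): string {
  const value = env[name];
  return value !== undefined && value !== '' ? value : fallback;
}

// In a real worker you would pass process.env; this example map
// overrides two variables and lets the rest fall back to defaults.
const example = { PORT: '8000', VERBOSE: 'false' };
const config = {
  port: Number(envOr(example, 'PORT', '7777')),
  bundlesPath: envOr(example, 'BUNDLES_PATH', '/data/tonk/bundles'),
  storesPath: envOr(example, 'STORES_PATH', '/data/tonk/stores'),
  verbose: envOr(example, 'VERBOSE', 'true') === 'true',
  syncInterval: Number(envOr(example, 'SYNC_INTERVAL', '0')),
};
console.log(config.port); // 8000 — taken from the example map, not the default
```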
Troubleshooting
Common Issues
- Port conflicts: Ensure ports 7777 and 8000 are available
- Permission issues: Check file permissions for data directories
- Network connectivity: Verify security group settings
- Resource limits: Monitor CPU and memory usage
Debugging Commands
# Check container logs
docker logs tonk-server
# Check container status
docker ps
# Check Tonk server health
curl http://localhost:7777/ping
# Check running bundles
tonk ps
One-Touch Hosting (Experimental)
⚠️ EXPERIMENTAL FEATURE ⚠️
The tonk deploy command provides experimental one-touch hosting for Tonk applications. This feature is highly experimental and will change drastically. Expect data loss. You must obtain an access code to use this feature (see below).
Usage
In your Tonk app directory run:
tonk deploy
Options:
- -n, --name <name>: Name for the deployed app (defaults to package.json name)
- -r, --region <region>: Region to deploy to (default: ord)
- -m, --memory <memory>: Memory allocation (e.g., 256mb, 1gb) (default: 1gb)
- -c, --cpus <cpus>: Number of CPUs (default: 1)
- --skip-build: Skip the build step
- --remote: Use remote Docker build
Important Warnings
- Data Loss: Your deployed applications and data may be lost at any time
- Breaking Changes: The deployment architecture will change soon
- No SLA: This is an experimental service with no uptime guarantees
- Access Required: You must have a valid deploy code from Tonk
How It Works
- Automatically builds your Tonk application with the correct configuration
- Creates a compressed bundle of your project
- Uploads the bundle to Tonk's hosting service
- Starts the application with route-based serving
- Returns a public URL for your deployed application
Use tonk push for local deployment or tonk deploy for cloud hosting.
Getting Access
Contact Jack at Tonk (Telegram) to request a deploy code.
Next Steps
- Explore Tonk Workers for background processing
Reference
This reference guide provides detailed information about Tonk commands, features, and troubleshooting tips.
Command reference
The Tonk CLI includes the following commands:
tonk hello
Initializes the Tonk daemon, which provides synchronization services for your apps.
Usage: tonk hello [options]
Say hello to start and launch the tonk daemon
Options:
-h, --help display help for command
tonk create
Creates a new Tonk application with an interactive setup process.
Usage: tonk create [options]
Create a new tonk application or component
Options:
-i, --init initialize in the folder
-h, --help display help for command
tonk push
Builds, packages, uploads, and starts your application bundle on the Tonk server. This is the primary deployment command that handles the complete workflow by default.
Usage: tonk push [options]
Package, upload, build and start a bundle on the Tonk server
Options:
-u, --url <url> URL of the Tonk server (default: "http://localhost:7777")
-n, --name <name> Name for the bundle (defaults to directory name)
-d, --dir <dir> Directory to bundle (defaults to ./dist)
-r, --route <route> Route path for the bundle (defaults to /bundleName)
--no-build Skip building the project before pushing
--no-start Skip starting the bundle after upload
-h, --help display help for command
tonk ls
Lists available application bundles on the Tonk server.
Usage: tonk ls [options]
List available bundles on the Tonk server
Options:
-u, --url <url> URL of the Tonk server (default: "http://localhost:7777")
-h, --help display help for command
tonk ps
Shows currently running bundle servers.
Usage: tonk ps [options]
List running bundle servers
Options:
-u, --url <url> URL of the Tonk server (default: "http://localhost:7777")
-h, --help display help for command
tonk start <bundle-name>
Starts a bundle server for a specific bundle.
Usage: tonk start [options] <bundleName>
Start a bundle server
Arguments:
bundleName Name of the bundle to start
Options:
-u, --url <url> URL of the Tonk server (default: "http://localhost:7777")
-p, --port <port> Port for the bundle server (optional)
-h, --help display help for command
tonk kill <server-id>
Stops a running bundle server.
Usage: tonk kill [options] <serverId>
Stop a running bundle server
Arguments:
serverId ID of the server to stop
Options:
-u, --url <url> URL of the Tonk server (default: "http://localhost:7777")
-h, --help display help for command
tonk proxy <bundle-name>
Creates a reverse proxy to access a Tonk bundle using SSH tunnelling with Pinggy service.
Usage: tonk proxy [options] <bundleName>
Create a reverse proxy to access a Tonk bundle
Arguments:
bundleName Name of the bundle to proxy
Options:
-u, --url <url> URL of the Tonk server (default: "http://localhost:7777")
-h, --help display help for command
This command checks if the specified bundle is running, then creates an SSH tunnel using Pinggy to make the bundle accessible via a public URL with QR code for easy mobile access.
tonk worker
Manages worker processes and configurations. The worker command provides comprehensive lifecycle management for Tonk workers.
Usage: tonk worker [command] [options]
Manage Tonk workers
Commands:
inspect <nameOrId> Inspect a specific worker
ls List all registered workers
rm <nameOrId> Remove a registered worker
ping <nameOrId> Ping a worker to check its status
start <nameOrId> Start a worker
stop <nameOrId> Stop a worker
logs <nameOrId> View logs for a worker
register [dir] Register a worker with Tonk
install <package> Install and start a worker from npm
init Initialize a new worker configuration file
Options:
-h, --help display help for command
Worker Subcommands
tonk worker inspect <nameOrId>
Inspect a specific worker and optionally perform actions on it.
Options:
-s, --start Start the worker
-S, --stop Stop the worker
-c, --config <path> Path to worker configuration file
-p, --ping Ping the worker to check its status
-h, --help display help for command
tonk worker logs <nameOrId>
View logs for a worker using PM2 integration.
Options:
-f, --follow Follow log output
-l, --lines <n> Number of lines to show (default: "100")
-e, --error Show only error logs
-o, --out Show only standard output logs
-h, --help display help for command
tonk worker register [dir]
Register a worker with Tonk from a directory containing worker configuration.
Arguments:
dir Path to worker directory (defaults to current directory)
Options:
-n, --name <n> Name of the worker
-e, --endpoint <endpoint> Endpoint URL of the worker
-p, --port <port> Port number for the worker
-d, --description <description> Description of the worker
-h, --help display help for command
tonk worker install <package>
Install and start a worker directly from an npm package.
Arguments:
package NPM package name
Options:
-p, --port <port> Specify a port for the worker (default: auto-detect)
-n, --name <n> Custom name for the worker (default: npm package name)
-h, --help display help for command
tonk worker init
Initialize a new worker configuration file in the current or specified directory.
Options:
-d, --dir <directory> Directory to create the configuration file in (default: ".")
-n, --name <n> Name of the worker
-p, --port <port> Port number for the worker (default: "5555")
-h, --help display help for command
FAQ
Pre-requisites to install
- You'll need Node.js and npm installed to run the Tonk installation command.
How do I get it working on Windows?
Tonk should work on Windows without any extra configuration.
- Install Tonk via npm:
npm install -g @tonk/cli
- Start Tonk:
tonk hello
LLM Instructions
This section contains all the instructions for AI coding assistants (LLMs) working with the Tonk codebase. The instructions are organized into shared common patterns and template-specific variations.
Organization
Shared Instructions
Common patterns and guidelines used across multiple templates:
- Keepsync - Data synchronization patterns
- React/Browser - Browser-based keepsync usage
- Worker/Node.js - Node.js-based keepsync usage
- Examples - Code examples for both environments
- Components - Component creation guidelines
- Stores - State management patterns
- Views - View creation guidelines
- Server - Server endpoint patterns
- Instructions - General instruction patterns
Template-Specific Instructions
Variations and specifics for different template types:
- React - React application templates
- Worker - Background worker templates
- Workspace - Full workspace templates
Usage
These instructions serve as the single source of truth for LLM guidance. They are automatically distributed to the appropriate template locations using the distribute-from-docs.js utility.
Maintenance
When updating LLM instructions:
- Edit the appropriate file in the docs/src/llms/ directory
- Run node utils/distribute-from-docs.js to distribute changes
- Changes will be automatically synced to all template locations
This ensures consistency across all AI coding tools (Claude, Cursor, Windsurf) and template types.
Keepsync Instructions
Keepsync is Tonk's local-first sync engine that provides real-time collaboration and data synchronization. This section contains environment-specific instructions for working with keepsync.
Environment-Specific Instructions
React/Browser Environment
- React/Browser Instructions - Complete guide for using keepsync in React applications
- IndexedDB storage for browser persistence
- WebSocket connections for real-time sync
- React hooks and Zustand store integration
Worker/Node.js Environment
- Worker/Node.js Instructions - Complete guide for using keepsync in Node.js workers
- Node.js filesystem storage
- Server-side document operations
- Background processing patterns
Code Examples
React Examples
- React Examples - Complete todo application with collaborative features
- Zustand store with sync middleware
- React components using synced state
- Real-time collaboration patterns
Worker Examples
- Worker Examples - API data fetching and processing
- External API integration
- Document storage and retrieval
- Background processing workflows
Key Concepts
Core Features
- Real-time Synchronization: Changes are instantly propagated across all clients
- Offline-first: Applications work without internet, sync when reconnected
- Conflict Resolution: Automatic conflict resolution using Automerge CRDTs
- Path-based Storage: Filesystem-like document organization
Common Patterns
- Synced Stores: Zustand stores with automatic synchronization
- Document Operations: Direct document reading and writing
- File System Operations: Directory and document management
- Error Handling: Graceful handling of network and initialization failures
Usage Guidelines
- Choose the Right Environment: Use React/Browser instructions for frontend apps, Worker/Node.js for backend services
- Follow Path Conventions: Use clear, hierarchical paths like users/profiles/john
- Handle Initialization: Always provide error handling for sync engine initialization
- Use Meaningful Document IDs: Make document IDs descriptive and unique
- Clean Up Listeners: Remove listeners when components unmount or are no longer needed
API Reference
Both environments provide the same core API with environment-specific adapters:
- configureSyncEngine() - Initialize the sync engine
- readDoc() / writeDoc() - Document operations
- sync() middleware - Zustand store synchronization
- listenToDoc() - Real-time document listening
- ls() / mkDir() / rm() - Filesystem operations
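To make the path-based semantics of this API concrete, here is a dependency-free, in-memory mock. It is an illustrative sketch only: the real keepsync functions are async and backed by Automerge CRDTs, whereas this mock is synchronous and purely local, just to show how documents and directory listings relate to paths.

```typescript
// An illustrative, in-memory mock of keepsync's path-based document API.
// NOT the real library: synchronous and local, for demonstration only.
type Doc = Record<string, unknown>;

class MockDocStore {
  private docs = new Map<string, Doc>();
  private dirs = new Set<string>(["/"]);

  mkDir(path: string): void {
    // Creating an existing directory is a no-op, like the real API.
    this.dirs.add(path);
  }

  writeDoc(path: string, content: Doc): void {
    // Creates the document if missing, overwrites it otherwise.
    this.docs.set(path, content);
  }

  readDoc(path: string): Doc | undefined {
    return this.docs.get(path);
  }

  ls(dir: string): string[] {
    // List direct children of a directory, like a filesystem.
    const prefix = dir.endsWith("/") ? dir : dir + "/";
    return [...this.docs.keys()]
      .filter(p => p.startsWith(prefix) && !p.slice(prefix.length).includes("/"))
      .map(p => p.slice(prefix.length));
  }

  rm(path: string, recursive = false): void {
    if (recursive) {
      // Remove a directory and every document under it.
      const prefix = path.endsWith("/") ? path : path + "/";
      for (const p of [...this.docs.keys()]) {
        if (p.startsWith(prefix)) this.docs.delete(p);
      }
      this.dirs.delete(path);
    } else {
      this.docs.delete(path);
    }
  }
}

const store = new MockDocStore();
store.mkDir("/users");
store.writeDoc("/users/user1", { name: "Alice" });
store.writeDoc("/users/user2", { name: "Bob" });
console.log(store.ls("/users")); // ["user1", "user2"]
store.rm("/users/user2");
console.log(store.ls("/users")); // ["user1"]
```

The same mental model (documents addressed by hierarchical paths, directories listing their direct children) carries over directly to the real async API shown below.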
Basic Usage
1. Set Up the Sync Provider
Initialize the sync engine in your application entry point (or before using any synced stores):
// index.tsx
import { configureSyncEngine } from '@tonk/keepsync';
import { BrowserWebSocketClientAdapter } from '@automerge/automerge-repo-network-websocket';
import { IndexedDBStorageAdapter } from '@automerge/automerge-repo-storage-indexeddb';
const wsProtocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:';
const wsUrl = `${wsProtocol}//${window.location.host}/sync`;
const wsAdapter = new BrowserWebSocketClientAdapter(wsUrl);
const storage = new IndexedDBStorageAdapter();
const url =
window.location.host.indexOf('localhost') === 0
? 'http://localhost:7777'
: `${window.location.protocol}//${window.location.host}`;
await configureSyncEngine({
url,
network: [wsAdapter as any],
storage,
});
2. Create a Synced Store with the Middleware
Use the sync middleware to create stores that automatically synchronize with other clients:
// stores/counterStore.ts
import { create } from 'zustand';
import { sync, DocumentId } from '@tonk/keepsync';
interface CounterState {
count: number;
increment: () => void;
decrement: () => void;
reset: () => void;
}
export const useCounterStore = create<CounterState>(
sync(
// The store implementation
set => ({
count: 0,
// Increment the counter
increment: () => {
set(state => ({ count: state.count + 1 }));
},
// Decrement the counter
decrement: () => {
set(state => ({ count: Math.max(0, state.count - 1) }));
},
// Reset the counter
reset: () => {
set({ count: 0 });
},
}),
// Sync configuration
{
docId: 'counter' as DocumentId,
// Optional: configure initialization timeout
initTimeout: 30000,
// Optional: handle initialization errors
onInitError: error => console.error('Sync initialization error:', error),
}
)
);
3. Use the Store in React Components
// components/Counter.tsx
import React from 'react';
import { useCounterStore } from '../stores/counterStore';
export function Counter() {
// Use the store hook directly - sync is handled by the middleware
const { count, increment, decrement, reset } = useCounterStore();
return (
<div>
<h2>Collaborative Counter: {count}</h2>
<div>
<button onClick={decrement}>-</button>
<button onClick={increment}>+</button>
<button onClick={reset}>Reset</button>
</div>
<p>
<small>
Open this app in multiple windows to see real-time collaboration in action.
</small>
</p>
</div>
);
}
Directly reading and writing documents
You can also directly read and write documents, addressing them with filesystem-like paths. This is useful when you need more fine-grained control over document access and a Zustand store is too cumbersome (e.g. when each document should have its own space and be directly addressable):
import { readDoc, writeDoc, ls, mkDir, rm, listenToDoc } from "@tonk/keepsync";
/**
* Reads a document from keepsync
*
* This function retrieves a document at the specified path in your sync engine.
* It returns the document content if found, or undefined if the document doesn't exist.
*
* @param path - The path identifying the document to read
* @returns Promise resolving to the document content or undefined if not found
* @throws Error if the SyncEngine is not properly initialized
*/
readDoc = async <T>(path: string): Promise<T | undefined>;
/**
* Writes content to a document to keepsync
*
* This function creates or updates a document at the specified path.
* If the document doesn't exist, it creates a new one.
* If the document already exists, it updates it with the provided content.
*
* @param path - The path identifying the document to write
* @param content - The content to write to the document
* @throws Error if the SyncEngine is not properly initialized
*/
writeDoc = async <T>(path: string, content: T): Promise<void>;
/**
* Lists documents at a specified path
*
* This function retrieves a list of documents at the specified directory path.
* It returns an array of document names found at that path.
*
* @param path - The directory path to list documents from
* @returns Promise resolving to an array of document names
* @throws Error if the SyncEngine is not properly initialized
*/
ls = async (path: string): Promise<string[]>;
/**
* Creates a directory at the specified path
*
* This function creates a new directory at the specified path.
* If the directory already exists, it does nothing.
*
* @param path - The path where the directory should be created
* @throws Error if the SyncEngine is not properly initialized
*/
mkDir = async (path: string): Promise<void>;
/**
* Removes a document or directory at the specified path
*
* This function deletes a document or directory at the specified path.
* If removing a directory, it will remove all documents within it.
*
* @param path - The path of the document or directory to remove
* @param recursive - Whether to recursively remove directories (default: false)
* @throws Error if the SyncEngine is not properly initialized
*/
rm = async (path: string, recursive?: boolean): Promise<void>;
/**
* Listens for changes to a document
*
* This function sets up a listener for changes to a document at the specified path.
* The callback will be called whenever the document changes with detailed patch information.
*
* @param path - The path of the document to listen to
* @param callback - Function to call when the document changes, receives payload with doc, patches, patchInfo, and handle
* @returns A function that can be called to stop listening
* @throws Error if the SyncEngine is not properly initialized
*/
listenToDoc = <T>(path: string, callback: (payload: { doc: T; patches: any[]; patchInfo: any; handle: DocHandle<T> }) => void): Promise<() => void>;
File System Operations Example
Here's an example of how to use the file system operations:
import { ls, mkDir, rm, readDoc, writeDoc, listenToDoc } from '@tonk/keepsync';
// Create a directory structure
await mkDir('/users');
// Write a document
await writeDoc('/users/user1', { name: 'Alice', age: 30 });
await writeDoc('/users/user2', { name: 'Bob', age: 25 });
// List documents in a directory
const users = await ls('/users');
console.log(users); // ["user1", "user2"]
// Read a document
const user1 = await readDoc<{ name: string; age: number }>('/users/user1');
console.log(user1); // { name: "Alice", age: 30 }
// Listen for changes to a document
const unsubscribe = await listenToDoc<{ name: string; age: number }>('/users/user1', payload => {
const { doc: user, patches, patchInfo, handle } = payload;
if (user) {
console.log(`User updated: ${user.name}, ${user.age}`);
console.log('Patches:', patches);
console.log('Patch info:', patchInfo);
}
});
// Update the document (will trigger the listener)
await writeDoc('/users/user1', { name: 'Alice', age: 31 });
// Stop listening when done
unsubscribe();
// Remove a document
await rm('/users/user2');
// Remove a directory and all its contents
await rm('/users', true);
DO NOT EDIT - AUTO-GENERATED FROM docs/src/llms/shared/keepsync/worker-nodejs.md
This file is automatically generated from the documentation.
Edit the source file instead: docs/src/llms/shared/keepsync/worker-nodejs.md
Generated on: 2025-07-17T13:11:10.053Z
Basic Usage
1. Set Up the Sync Provider
Initialize the sync engine in your application entry point (or before using any synced stores):
// index.tsx
import { configureSyncEngine } from "@tonk/keepsync";
import { BrowserWebSocketClientAdapter } from "@automerge/automerge-repo-network-websocket";
import { NodeFSStorageAdapter } from "@automerge/automerge-repo-storage-nodefs";
const wsAdapter = new BrowserWebSocketClientAdapter("ws://localhost:7777/sync");
const storage = new NodeFSStorageAdapter();
await configureSyncEngine({
url: "http://localhost:7777",
network: [wsAdapter as any],
storage,
});
2. Create a Synced Store with the Middleware
Use the sync middleware to create stores that automatically synchronize with other clients:
// stores/counterStore.ts
import { createStore } from 'zustand/vanilla';
import { sync, DocumentId } from '@tonk/keepsync';
interface CounterState {
count: number;
increment: () => void;
decrement: () => void;
reset: () => void;
}
export const counterStore = createStore<CounterState>(
sync(
// The store implementation
set => ({
count: 0,
// Increment the counter
increment: () => {
set(state => ({ count: state.count + 1 }));
},
// Decrement the counter
decrement: () => {
set(state => ({ count: Math.max(0, state.count - 1) }));
},
// Reset the counter
reset: () => {
set({ count: 0 });
},
}),
// Sync configuration
{
docId: 'counter' as DocumentId,
// Optional: configure initialization timeout
initTimeout: 30000,
// Optional: handle initialization errors
onInitError: error => console.error('Sync initialization error:', error),
}
)
);
3. Manually fetch the state
Because this is a Node project, Zustand is used differently than in React components: there are no hooks, so each time you want fresh state you call the store's getState() function.
const counterStore = createStore<CounterState>(
sync(
// The store implementation
set => ({
count: 0,
// Increment the counter
increment: () => {
set(state => ({ count: state.count + 1 }));
},
// Decrement the counter
decrement: () => {
set(state => ({ count: Math.max(0, state.count - 1) }));
},
// Reset the counter
reset: () => {
set({ count: 0 });
},
}),
// Sync configuration
{
docId: 'counter' as DocumentId,
// Optional: configure initialization timeout
initTimeout: 30000,
// Optional: handle initialization errors
onInitError: error => console.error('Sync initialization error:', error),
}
)
);
const state = counterStore.getState();
state.increment();
console.log(`The current count is: ${counterStore.getState().count}`);
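The vanilla-store pattern above rests on a small contract: getState() returns the current snapshot, and subscribe() notifies listeners on every update. The following dependency-free sketch illustrates that contract. It is NOT zustand itself (createTinyStore is a made-up name), just enough code to show why Node workers read state imperatively instead of through a React hook.

```typescript
// Minimal illustration of the vanilla-store contract: getState() for
// snapshots, subscribe() for change notifications. Hypothetical sketch,
// not the real zustand implementation.
type Listener<S> = (state: S, prev: S) => void;

function createTinyStore<S extends object>(
  init: (set: (update: (s: S) => Partial<S>) => void) => S
) {
  let state: S;
  const listeners = new Set<Listener<S>>();

  const set = (update: (s: S) => Partial<S>) => {
    // Immutable update: build a new state object, then notify listeners.
    const prev = state;
    state = { ...state, ...update(state) };
    listeners.forEach(l => l(state, prev));
  };

  state = init(set);

  return {
    getState: () => state,
    subscribe: (l: Listener<S>) => {
      listeners.add(l);
      return () => listeners.delete(l); // returns an unsubscribe function
    },
  };
}

interface CounterState {
  count: number;
  increment: () => void;
}

const counter = createTinyStore<CounterState>(set => ({
  count: 0,
  increment: () => set(s => ({ count: s.count + 1 })),
}));

counter.getState().increment();
console.log(counter.getState().count); // 1
```

In a real worker, subscribe() (which vanilla zustand stores also expose) is often more convenient than polling getState(), since it lets you react to synced changes as they arrive.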
Directly reading and writing documents
You can also directly read and write documents, addressing them with filesystem-like paths. This is useful when you need more fine-grained control over document access and a Zustand store is too cumbersome (e.g. when each document should have its own space and be directly addressable):
import { readDoc, writeDoc, ls, mkDir, rm, listenToDoc } from "@tonk/keepsync";
/**
* Reads a document from keepsync
*
* This function retrieves a document at the specified path in your sync engine.
* It returns the document content if found, or undefined if the document doesn't exist.
*
* @param path - The path identifying the document to read
* @returns Promise resolving to the document content or undefined if not found
* @throws Error if the SyncEngine is not properly initialized
*/
readDoc = async <T>(path: string): Promise<T | undefined>;
/**
* Writes content to a document to keepsync
*
* This function creates or updates a document at the specified path.
* If the document doesn't exist, it creates a new one.
* If the document already exists, it updates it with the provided content.
*
* @param path - The path identifying the document to write
* @param content - The content to write to the document
* @throws Error if the SyncEngine is not properly initialized
*/
writeDoc = async <T>(path: string, content: T): Promise<void>;
/**
* Lists documents at a specified path
*
* This function retrieves a list of documents at the specified directory path.
* It returns an array of document names found at that path.
*
* @param path - The directory path to list documents from
* @returns Promise resolving to an array of document names
* @throws Error if the SyncEngine is not properly initialized
*/
ls = async (path: string): Promise<string[]>;
/**
* Creates a directory at the specified path
*
* This function creates a new directory at the specified path.
* If the directory already exists, it does nothing.
*
* @param path - The path where the directory should be created
* @throws Error if the SyncEngine is not properly initialized
*/
mkDir = async (path: string): Promise<void>;
/**
* Removes a document or directory at the specified path
*
* This function deletes a document or directory at the specified path.
* If removing a directory, it will remove all documents within it.
*
* @param path - The path of the document or directory to remove
* @param recursive - Whether to recursively remove directories (default: false)
* @throws Error if the SyncEngine is not properly initialized
*/
rm = async (path: string, recursive?: boolean): Promise<void>;
/**
* Listens for changes to a document
*
* This function sets up a listener for changes to a document at the specified path.
* The callback will be called whenever the document changes with detailed patch information.
*
* @param path - The path of the document to listen to
* @param callback - Function to call when the document changes, receives payload with doc, patches, patchInfo, and handle
* @returns A function that can be called to stop listening
* @throws Error if the SyncEngine is not properly initialized
*/
listenToDoc = <T>(path: string, callback: (payload: { doc: T; patches: any[]; patchInfo: any; handle: DocHandle<T> }) => void): Promise<() => void>;
File System Operations Example
Here's an example of how to use the file system operations:
import { ls, mkDir, rm, readDoc, writeDoc, listenToDoc } from '@tonk/keepsync';
// Create a directory structure
await mkDir('/users');
// Write a document
await writeDoc('/users/user1', { name: 'Alice', age: 30 });
await writeDoc('/users/user2', { name: 'Bob', age: 25 });
// List documents in a directory
const users = await ls('/users');
console.log(users); // ["user1", "user2"]
// Read a document
const user1 = await readDoc<{ name: string; age: number }>('/users/user1');
console.log(user1); // { name: "Alice", age: 30 }
// Listen for changes to a document
const unsubscribe = await listenToDoc<{ name: string; age: number }>('/users/user1', payload => {
const { doc: user, patches, patchInfo, handle } = payload;
if (user) {
console.log(`User updated: ${user.name}, ${user.age}`);
console.log('Patches:', patches);
console.log('Patch info:', patchInfo);
}
});
// Update the document (will trigger the listener)
await writeDoc('/users/user1', { name: 'Alice', age: 31 });
// Stop listening when done
unsubscribe();
// Remove a document
await rm('/users/user2');
// Remove a directory and all its contents
await rm('/users', true);
React Keepsync Examples
This section shows complete examples of how to use keepsync in React applications. These examples demonstrate a collaborative todo application with real-time synchronization.
Todo Store
A complete Zustand store with keepsync synchronization:
import { sync, DocumentId } from "@tonk/keepsync";
import { create } from "zustand";
// Define the Todo type
export interface Todo {
id: string;
text: string;
completed: boolean;
}
// Define the data structure
interface TodoData {
todos: Todo[];
}
// Define the store state
interface TodoState extends TodoData {
addTodo: (text: string) => void;
toggleTodo: (id: string) => void;
deleteTodo: (id: string) => void;
}
// Create a synced store for todos
export const useTodoStore = create<TodoState>(
sync(
(set) => ({
todos: [],
// Add a new todo
addTodo: (text: string) => {
set((state) => ({
todos: [
...state.todos,
{
id: crypto.randomUUID(),
text,
completed: false,
},
],
}));
},
// Toggle a todo's completed status
toggleTodo: (id: string) => {
set((state) => ({
todos: state.todos.map((todo) =>
todo.id === id ? { ...todo, completed: !todo.completed } : todo
),
}));
},
// Delete a todo
deleteTodo: (id: string) => {
set((state) => ({
todos: state.todos.filter((todo) => todo.id !== id),
}));
},
}),
{
// Unique document ID for this store
docId: "todo-list" as DocumentId,
}
)
);
Todo List Component
A React component that displays and manages todos:
import React from "react";
import { Todo, useTodoStore } from "../stores/todoStore";
const TodoList = () => {
const { todos, deleteTodo, toggleTodo } = useTodoStore();
if (todos.length === 0) {
return <p>No todos yet. Add one above!</p>;
}
return (
<ul className="todo-list">
{todos.map((todo: Todo) => (
<li
key={todo.id}
className={`todo-item ${todo.completed ? "completed" : ""}`}
>
<input
type="checkbox"
checked={todo.completed}
onChange={() => toggleTodo(todo.id)}
/>
<span className="todo-text">{todo.text}</span>
<button className="delete-btn" onClick={() => deleteTodo(todo.id)}>
Delete
</button>
</li>
))}
</ul>
);
};
export default TodoList;
Add Todo Component
A React component for adding new todos:
import React from "react";
import { useState } from "react";
import { useTodoStore } from "../stores/todoStore";
const AddTodo = () => {
const [text, setText] = useState("");
const { addTodo } = useTodoStore();
const handleSubmit = (e: React.FormEvent) => {
e.preventDefault();
if (text.trim()) {
addTodo(text.trim());
setText("");
}
};
return (
<form onSubmit={handleSubmit} className="add-todo">
<input
type="text"
value={text}
onChange={(e) => setText(e.target.value)}
placeholder="What needs to be done?"
/>
<button type="submit">Add Todo</button>
</form>
);
};
export default AddTodo;
Main App Component
The main application component that brings everything together:
import React from "react";
import { useState } from "react";
import TodoList from "./components/TodoList";
import AddTodo from "./components/AddTodo";
function App() {
const [connectionStatus, setConnectionStatus] = useState<
"connected" | "disconnected"
>("connected");
// Handle disconnecting from the sync engine
const handleDisconnect = () => {
setConnectionStatus("disconnected");
};
// Handle reconnecting to the sync engine
const handleReconnect = () => {
// This will reinitialize the sync engine on the next store access
setConnectionStatus("connected");
};
return (
<div className="container">
<h1>Collaborative Todo List</h1>
<p>
Status:{" "}
<span
style={{ color: connectionStatus === "connected" ? "green" : "red" }}
>
{connectionStatus}
</span>
{connectionStatus === "connected" ? (
<button onClick={handleDisconnect} style={{ marginLeft: "1rem" }}>
Disconnect
</button>
) : (
<button onClick={handleReconnect} style={{ marginLeft: "1rem" }}>
Reconnect
</button>
)}
</p>
<AddTodo />
<TodoList />
<p>
<small>
Changes are automatically synced across all connected clients.
<br />
Open this app in multiple windows to see real-time collaboration in
action.
</small>
</p>
</div>
);
}
export default App;
Key Concepts Demonstrated
- Sync Middleware: Using sync() to create a collaborative Zustand store
- Document ID: Using docId to identify the shared document
- Real-time Updates: Changes are automatically synchronized across all clients
- Component Integration: React components seamlessly use the synced store
- Connection Management: Handling connection status and reconnection scenarios
Running the Example
To run this example:
- Create a new Tonk React app: tonk create (choose React)
- Replace the generated files with the code above
- Start the development server: pnpm dev
- Open multiple browser windows to see real-time collaboration
The todos will be synchronized in real-time across all connected clients!
Worker Keepsync Examples
This section shows examples of how to use keepsync in background workers. These examples demonstrate data fetching, processing, and storage patterns for Node.js environments.
API Data Fetching Worker
A complete example of fetching data from an external API and storing it in keepsync:
The full example is included from packages/create/templates/worker/instructions/keepsync/examples/worker/index.ts in the Tonk repository.
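The core of such a worker (fetch external data, merge it with what is already stored, write the result back) can be sketched as follows. Everything here is illustrative: fetchItems is a hypothetical stand-in for a real API call, and readDoc/writeDoc are stubbed in memory rather than imported from @tonk/keepsync.

```typescript
// Illustrative sketch of the worker pattern: fetch, merge, write back.
// fetchItems and the in-memory readDoc/writeDoc stubs are hypothetical;
// a real worker would call an actual API and import readDoc/writeDoc
// from @tonk/keepsync.
interface Item { id: string; value: number }

const memory = new Map<string, Item[]>();
const readDoc = async (path: string): Promise<Item[] | undefined> => memory.get(path);
const writeDoc = async (path: string, content: Item[]): Promise<void> => {
  memory.set(path, content);
};

// Stand-in for an external API call.
async function fetchItems(): Promise<Item[]> {
  return [{ id: "a", value: 2 }, { id: "c", value: 3 }];
}

// Merge incoming items into existing ones by id; incoming data wins.
function mergeById(existing: Item[], incoming: Item[]): Item[] {
  const byId = new Map<string, Item>(existing.map(i => [i.id, i]));
  for (const item of incoming) byId.set(item.id, item);
  return [...byId.values()];
}

async function syncOnce(path: string): Promise<void> {
  try {
    const incoming = await fetchItems();
    const existing = (await readDoc(path)) ?? [];
    await writeDoc(path, mergeById(existing, incoming));
  } catch (err) {
    // Workers should recover from API failures rather than crash.
    console.error("sync failed, will retry on next run:", err);
  }
}

void syncOnce("external/items");
```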
Key Concepts Demonstrated
- Data Fetching: Fetching data from external APIs
- Document Operations: Using readDoc() and writeDoc() for data storage
- Data Merging: Combining existing data with new data
- Error Handling: Proper error handling in worker contexts
- Path-based Storage: Using filesystem-like paths for data organization
Worker Patterns
Workers typically follow these patterns:
- Scheduled Data Fetching: Periodically fetch data from external sources
- Document Storage: Store processed data in keepsync documents
- Data Transformation: Process raw data into structured formats
- Error Recovery: Handle API failures and network issues gracefully
Running the Example
To run this example:
- Create a new Tonk worker: tonk create (choose Worker)
- Replace the generated code with the example above
- Configure your API endpoints and credentials
- Start the worker: pnpm dev
The worker will fetch data and store it in keepsync, making it available to all connected applications!
How to create components
Components should be...
- Pure and have only internal UI state. All other state is external.
- Always use inline styling with Tailwind
- Each component should have its own folder, with index.tsx as the entry point
- Export typed definitions of props
Examples
Task Card Component
import React, { useState } from 'react';
import { CheckCircle, Clock, AlertCircle } from 'lucide-react';
/**
* Represents the priority level of a task
* @readonly
*/
export type TaskPriority = 'low' | 'medium' | 'high';
/**
* Represents the current status of a task
* @readonly
*/
export type TaskStatus = 'pending' | 'in-progress' | 'completed';
/**
* Configuration options for the TaskCard component
* @interface
*/
export interface TaskCardProps {
/**
* The unique identifier for the task
*/
id: string;
/**
* The main title/description of the task
*/
title: string;
/**
* The priority level of the task
* @default 'medium'
*/
priority?: TaskPriority;
/**
* The initial status of the task
* @default 'pending'
*/
initialStatus?: TaskStatus;
/**
* Optional due date for the task
*/
dueDate?: Date;
/**
* Callback function triggered when task status changes
* @param newStatus - The new status value
* @param taskId - The ID of the task that changed
*/
onStatusChange?: (newStatus: TaskStatus, taskId: string) => void;
}
/**
* A rich task card component that displays task information and allows status updates
*
* @description
* TaskCard is a self-contained component that manages only its own UI state (all other state should be external) while still
* allowing parent components to track status changes. It includes visual indicators
* for priority levels and status, with a clean, accessible design.
*
* @example
* <TaskCard
* id="task-1"
* title="Complete project documentation"
* priority="high"
* dueDate={new Date('2025-03-01')}
* onStatusChange={(status, id) => console.log(`Task ${id} changed to ${status}`)}
* />
*/
const TaskCard: React.FC<TaskCardProps> = ({
id,
title,
priority = 'medium',
initialStatus = 'pending',
dueDate,
onStatusChange
}) => {
const [status, setStatus] = useState<TaskStatus>(initialStatus);
// Map priority levels to Tailwind classes
const priorityClasses = {
low: 'bg-blue-100 text-blue-800',
medium: 'bg-yellow-100 text-yellow-800',
high: 'bg-red-100 text-red-800'
};
// Map status to icons
const statusIcons = {
pending: <Clock className="w-5 h-5 text-gray-500" />,
'in-progress': <AlertCircle className="w-5 h-5 text-yellow-500" />,
completed: <CheckCircle className="w-5 h-5 text-green-500" />
};
/**
* Handles status updates and triggers the callback if provided
*/
const handleStatusChange = (newStatus: TaskStatus) => {
setStatus(newStatus);
onStatusChange?.(newStatus, id);
};
return (
<div className="rounded-lg border border-gray-200 p-4 shadow-sm hover:shadow-md transition-shadow">
<div className="flex items-start justify-between">
<div className="flex-1">
<h3 className="text-lg font-semibold text-gray-900">{title}</h3>
<div className="mt-2 flex items-center gap-2">
<span className={`inline-flex items-center px-2.5 py-0.5 rounded-full text-xs font-medium ${priorityClasses[priority]}`}>
{priority}
</span>
{dueDate && (
<span className="text-sm text-gray-500">
Due: {dueDate.toLocaleDateString()}
</span>
)}
</div>
</div>
<div className="flex items-center gap-2">
{statusIcons[status]}
<select
value={status}
onChange={(e) => handleStatusChange(e.target.value as TaskStatus)}
className="block rounded-md border-gray-300 text-sm focus:border-blue-500 focus:ring-blue-500"
>
<option value="pending">Pending</option>
<option value="in-progress">In Progress</option>
<option value="completed">Completed</option>
</select>
</div>
</div>
</div>
);
};
export default TaskCard;
Counter Component
import React, { useState } from 'react';
import { Plus, Minus } from 'lucide-react';
/**
* Props for the CounterButton component
*/
export interface CounterButtonProps {
/** Initial value for the counter */
initialValue?: number;
/** Maximum allowed value */
max?: number;
/** Minimum allowed value */
min?: number;
}
/**
* A simple counter button that allows incrementing and decrementing a value
* within specified bounds.
*/
const CounterButton: React.FC<CounterButtonProps> = ({
initialValue = 0,
max = 10,
min = 0
}) => {
const [count, setCount] = useState(initialValue);
const increment = () => {
if (count < max) {
setCount(count + 1);
}
};
const decrement = () => {
if (count > min) {
setCount(count - 1);
}
};
return (
<div className="flex items-center gap-3 rounded-lg border border-gray-200 p-2 w-40">
<button
onClick={decrement}
disabled={count <= min}
className="p-1 text-gray-600 hover:text-gray-900 disabled:opacity-50"
>
<Minus size={20} />
</button>
<span className="flex-1 text-center font-medium">
{count}
</span>
<button
onClick={increment}
disabled={count >= max}
className="p-1 text-gray-600 hover:text-gray-900 disabled:opacity-50"
>
<Plus size={20} />
</button>
</div>
);
};
export default CounterButton;
How to manage state and contexts
- Use explicit interfaces to define both state and actions
- Keep stores focused on a single need
- Each store should handle one specific piece of functionality
- Don't try to manage multiple concerns at once unless the state is shared between multiple components
- Do not over-engineer solutions to a specific problem, prefer simplicity and elegance
- Group actions near the state properties they modify to make their relationships clear
- Use semantic action names that describe the state change, like startEditing or toggleComplete over update and set
- Split state into logical groups (ui, validation, data) to make the structure intuitive and maintainable
- Make state updates atomic and predictable
- Each action should only update the state it needs to, using immutable patterns
- Include TypeScript interfaces and JSDoc comments that explain the purpose of each part of the store
When to use a store
- Stores hold data that will be accessed by any user of the app
- All clients have access to the data in stores
- Any data which would never need to be accessed outside of an immediate local context should not be held in a store, should not use the keepsync framework, and should not be placed inside the store directory
Examples
Counter Store - Basic Pattern
A minimalist store demonstrating best practices for component-specific state management:
State Interface Definition
/**
* Core state interface defining all properties that can be read
* Separates data (count) from UI concerns (animation state)
*/
interface CounterState {
/** The current numeric value of the counter */
count: number
/** UI-specific state properties */
ui: {
/** Tracks whether the counter is currently animating */
isAnimating: boolean
}
}
/**
* Actions interface defining all ways the state can be modified
* Each action represents a single, specific way to update the state
*/
interface CounterActions {
/** Increases the counter by 1 and triggers animation */
increment: () => void
/** Decreases the counter by 1 and triggers animation */
decrement: () => void
/** Resets the counter to 0 */
reset: () => void
/** Sets a specific value */
setValue: (value: number) => void
}
Store Implementation
/**
* Combined interface for the complete store
*/
type CounterStore = CounterState & CounterActions;
/**
* Create the counter store with Zustand
*/
export const useCounterStore = create<CounterStore>((set, get) => ({
// Initial state
count: 0,
ui: {
isAnimating: false
},
// Actions grouped logically
increment: () => {
set((state) => ({
count: state.count + 1,
ui: { ...state.ui, isAnimating: true }
}));
// Reset animation after delay
setTimeout(() => {
set((state) => ({
...state,
ui: { ...state.ui, isAnimating: false }
}));
}, 200);
},
decrement: () => {
set((state) => ({
count: Math.max(0, state.count - 1),
ui: { ...state.ui, isAnimating: true }
}));
setTimeout(() => {
set((state) => ({
...state,
ui: { ...state.ui, isAnimating: false }
}));
}, 200);
},
reset: () => {
set({ count: 0, ui: { isAnimating: false } });
},
setValue: (value: number) => {
if (value < 0) return; // Validation
set((state) => ({
count: value,
ui: { ...state.ui, isAnimating: false }
}));
}
}));
Component Usage
import { useCounterStore } from '../stores/counterStore';
export const Counter = () => {
// Extract only needed state and actions
const { count, ui, increment, decrement, reset } = useCounterStore();
return (
<div className={`transition-transform ${ui.isAnimating ? 'scale-105' : 'scale-100'}`}>
<span className="text-2xl font-bold">{count}</span>
<div className="space-x-2">
<button onClick={decrement}>-</button>
<button onClick={increment}>+</button>
<button onClick={reset}>Reset</button>
</div>
</div>
);
};
Todo Store - Complex State Management
A more complex example demonstrating validation, multiple data types, and error handling:
State and Types Definition
/**
* Individual todo item interface
*/
interface TodoItem {
id: string;
text: string;
completed: boolean;
createdAt: Date;
completedAt?: Date;
priority: 'low' | 'medium' | 'high';
}
/**
* Main state interface for the todo store
*/
interface TodoState {
/** Array of all todo items */
todos: TodoItem[];
/** Currently selected filter */
filter: 'all' | 'active' | 'completed';
/** Form state for adding new todos */
form: {
text: string;
priority: TodoItem['priority'];
isValid: boolean;
};
/** UI state */
ui: {
isLoading: boolean;
error: string | null;
};
}
/**
* Actions interface for todo management
*/
interface TodoActions {
// Todo CRUD operations
addTodo: (text: string, priority?: TodoItem['priority']) => void;
toggleTodo: (id: string) => void;
deleteTodo: (id: string) => void;
editTodo: (id: string, text: string) => void;
// Filter operations
setFilter: (filter: TodoState['filter']) => void;
// Form operations
updateFormText: (text: string) => void;
updateFormPriority: (priority: TodoItem['priority']) => void;
resetForm: () => void;
// Bulk operations
clearCompleted: () => void;
toggleAll: () => void;
}
Store Implementation
import { create } from 'zustand';

type TodoStore = TodoState & TodoActions;
export const useTodoStore = create<TodoStore>((set, get) => ({
// Initial state
todos: [],
filter: 'all',
form: {
text: '',
priority: 'medium',
isValid: false
},
ui: {
isLoading: false,
error: null
},
// Actions
addTodo: (text: string, priority = 'medium') => {
const trimmedText = text.trim();
if (!trimmedText) {
set((state) => ({
...state,
ui: { ...state.ui, error: 'Todo text cannot be empty' }
}));
return;
}
const newTodo: TodoItem = {
id: crypto.randomUUID(),
text: trimmedText,
completed: false,
createdAt: new Date(),
priority
};
set((state) => ({
todos: [...state.todos, newTodo],
form: { text: '', priority: 'medium', isValid: false },
ui: { ...state.ui, error: null }
}));
},
toggleTodo: (id: string) => {
set((state) => ({
todos: state.todos.map(todo =>
todo.id === id
? {
...todo,
completed: !todo.completed,
completedAt: !todo.completed ? new Date() : undefined
}
: todo
)
}));
},
deleteTodo: (id: string) => {
set((state) => ({
todos: state.todos.filter(todo => todo.id !== id)
}));
},
setFilter: (filter) => {
set((state) => ({ ...state, filter }));
},
updateFormText: (text: string) => {
set((state) => ({
form: {
...state.form,
text,
isValid: text.trim().length > 0
}
}));
},
editTodo: (id: string, text: string) => {
const trimmed = text.trim();
if (!trimmed) return;
set((state) => ({
todos: state.todos.map(todo =>
todo.id === id ? { ...todo, text: trimmed } : todo
)
}));
},
updateFormPriority: (priority: TodoItem['priority']) => {
set((state) => ({
form: { ...state.form, priority }
}));
},
resetForm: () => {
set({ form: { text: '', priority: 'medium', isValid: false } });
},
clearCompleted: () => {
set((state) => ({
todos: state.todos.filter(todo => !todo.completed)
}));
},
toggleAll: () => {
const allCompleted = get().todos.every(todo => todo.completed);
set((state) => ({
todos: state.todos.map(todo => ({ ...todo, completed: !allCompleted }))
}));
}
}));
Advanced Usage with Selectors
import { useMemo } from 'react';

// Custom selectors for computed values
export const useTodoSelectors = () => {
const { todos, filter } = useTodoStore();
return useMemo(() => {
const filteredTodos = todos.filter(todo => {
switch (filter) {
case 'active': return !todo.completed;
case 'completed': return todo.completed;
default: return true;
}
});
return {
filteredTodos,
activeTodosCount: todos.filter(t => !t.completed).length,
completedTodosCount: todos.filter(t => t.completed).length,
totalTodosCount: todos.length
};
}, [todos, filter]);
};
// Usage in component
export const TodoList = () => {
const { addTodo, toggleTodo, deleteTodo } = useTodoStore();
const { filteredTodos, activeTodosCount } = useTodoSelectors();
return (
<div>
<p>{activeTodosCount} items left</p>
{filteredTodos.map(todo => (
<TodoItem
key={todo.id}
todo={todo}
onToggle={() => toggleTodo(todo.id)}
onDelete={() => deleteTodo(todo.id)}
/>
))}
</div>
);
};
Best Practices Summary
1. State Structure
- Separate concerns: Keep UI state separate from data state
- Use meaningful names: State properties should be self-documenting
- Type everything: Use TypeScript interfaces for all state and actions
2. Action Design
- Semantic naming: Use descriptive action names (`startEditing` vs `setMode`)
- Atomic updates: Each action should do one specific thing
- Validation: Include input validation in actions
- Error handling: Set error state for failed operations
3. Component Integration
- Extract selectively: Only extract the state and actions you need
- Use selectors: Create computed values with `useMemo` for complex filtering
- Handle side effects: Use `useEffect` for animations and API calls
- Clean up: Remove listeners and cancel pending operations
4. Performance Tips
- Avoid object recreation: Use callbacks to update nested state
- Memoize selectors: Use `useMemo` for computed values
- Split large stores: Consider separate stores for independent concerns
- Use shallow equality: Be mindful of reference equality in React renders
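Outside of React, the "memoize selectors" tip reduces to caching a computed value by the reference of its input, which is essentially what `useMemo` does with its dependency array. A minimal framework-agnostic sketch (names are illustrative):

```typescript
interface Todo { id: string; completed: boolean }

// Cache the selector result, recomputing only when the input
// reference changes -- mirroring useMemo's dependency check
function memoizeSelector<T, R>(select: (input: T) => R): (input: T) => R {
  let lastInput: T | undefined;
  let lastResult!: R;
  return (input: T) => {
    if (input !== lastInput) {
      lastInput = input;
      lastResult = select(input);
    }
    return lastResult;
  };
}

let computeCount = 0;
const selectActive = memoizeSelector((todos: Todo[]) => {
  computeCount++; // track how often the filter actually runs
  return todos.filter(t => !t.completed);
});

const todos: Todo[] = [{ id: 'a', completed: false }, { id: 'b', completed: true }];
selectActive(todos);
selectActive(todos);       // same reference: cached, no recompute
console.log(computeCount); // 1
selectActive([...todos]);  // new reference: recomputes
console.log(computeCount); // 2
```

This also shows why "avoid object recreation" matters: creating a fresh array or object on every render defeats reference-based caching.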
How to create views
- Do not define new components, only use ones that already exist in the `src/components/` directory
- Use `div`s and Tailwind to appropriately display components
- Always use semantic HTML elements (e.g., `main`, `section`, `article`, `nav`) for better accessibility
- All props should be explicitly typed with TypeScript
- Provide default values for optional props to prevent runtime errors
- Use descriptive prop names that indicate both type and purpose (e.g., `isVisible` not `flag`)
Tailwind Usage
- Use only core Tailwind utility classes, no custom values
- Follow mobile-first responsive design using sm:, md:, lg: breakpoints
- Use semantic color classes (e.g., text-primary, bg-secondary) over literal colors
- Maintain consistent spacing scale using Tailwind's default spacing units
State Management
- Use proper React hooks for lifecycle management (useEffect, useMemo, useCallback)
- All state that needs to be synchronized across clients should use keepsync stores
- All state that is relevant to the view and doesn't need to synchronize may simply call on useState
- All external functionality not related to rendering should be in a module
- Document all logic with explicit comments
Accessibility
- Include ARIA labels and roles where appropriate
- Maintain proper heading hierarchy (h1 -> h6)
- Ensure sufficient color contrast using Tailwind's built-in colors
- Add keyboard navigation support for interactive elements
Code Style
- Use explicit return statements for complex render logic
- Add JSDoc comments for component props and important functions
- Include example usage in comments for non-obvious implementations
Examples
Basic View Structure
Start with proper imports and component setup:
/**
* ProfileSettings view component
*
* A view for managing user profile settings including personal information,
* account preferences, and notification settings.
*
* @example
* <ProfileSettings userId="user-123" isEditable={true} />
*/
import React, { useState, useEffect, useMemo } from 'react';
import { useProfileStore } from '../stores/profileStore';
import { Avatar } from '../components/Avatar';
import { TextField } from '../components/TextField';
import { Button } from '../components/Button';
interface ProfileSettingsProps {
userId: string;
isEditable?: boolean;
}
export const ProfileSettings: React.FC<ProfileSettingsProps> = ({
userId,
isEditable = true
}) => {
// Component implementation here...
};
State Management Pattern
Use keepsync stores for synchronized data and local state for UI-specific data:
export const ProfileSettings: React.FC<ProfileSettingsProps> = ({
userId,
isEditable = true
}) => {
// Use store for synchronized state across clients
// This data is automatically synced with the database by the store
const { profile, updateProfile } = useProfileStore();
// Use local state for form data (unsaved changes)
const [formData, setFormData] = useState({
name: '',
email: '',
bio: '',
avatarUrl: ''
});
// UI-specific state that doesn't need to be synced
const [isSaving, setIsSaving] = useState(false);
const [errorMessage, setErrorMessage] = useState<string | null>(null);
const [notifications, setNotifications] = useState({
email: true,
push: false,
sms: false
});
// Load profile data on component mount or when userId changes
useEffect(() => {
if (profile) {
setFormData({
name: profile.name,
email: profile.email,
bio: profile.bio || '',
avatarUrl: profile.avatarUrl || ''
});
setNotifications({
email: profile.notificationPreferences?.email || false,
push: profile.notificationPreferences?.push || false,
sms: profile.notificationPreferences?.sms || false
});
}
}, [profile, userId]);
};
Event Handlers
Implement clean event handlers with proper error handling:
const handleSave = async () => {
if (!isEditable) return;
setIsSaving(true);
setErrorMessage(null);
try {
// Validate form data
if (!formData.name.trim()) {
throw new Error('Name is required');
}
if (!formData.email.trim()) {
throw new Error('Email is required');
}
// Update profile through the store (automatically synced)
await updateProfile({
...formData,
notificationPreferences: notifications
});
console.log('Profile updated successfully');
} catch (error) {
setErrorMessage(error instanceof Error ? error.message : 'Failed to save profile');
console.error('Profile update failed:', error);
} finally {
setIsSaving(false);
}
};
const handleInputChange = (field: keyof typeof formData) => (value: string) => {
setFormData(prev => ({
...prev,
[field]: value
}));
};
const handleNotificationChange = (type: keyof typeof notifications) => (enabled: boolean) => {
setNotifications(prev => ({
...prev,
[type]: enabled
}));
};
Render Logic
Structure your JSX with proper semantic HTML and Tailwind classes:
return (
<main className="max-w-2xl mx-auto p-6 bg-white rounded-lg shadow-sm">
<header className="mb-8">
<h1 className="text-2xl font-semibold text-gray-900 mb-2">
Profile Settings
</h1>
<p className="text-gray-600">
Manage your account information and preferences
</p>
</header>
{errorMessage && (
<div className="mb-6 p-4 bg-red-50 border border-red-200 rounded-md">
<p className="text-red-700 text-sm">{errorMessage}</p>
</div>
)}
<form onSubmit={(e) => { e.preventDefault(); handleSave(); }}>
<section className="mb-8">
<h2 className="text-lg font-medium text-gray-900 mb-4">
Personal Information
</h2>
<div className="space-y-4">
<div className="flex items-center gap-4 mb-6">
<Avatar
src={formData.avatarUrl}
alt={formData.name}
size="large"
className="flex-shrink-0"
/>
<div>
<h3 className="text-sm font-medium text-gray-900">Profile Photo</h3>
<p className="text-xs text-gray-500">
Upload a new photo or change your existing one
</p>
</div>
</div>
<TextField
label="Full Name"
value={formData.name}
onChange={handleInputChange('name')}
placeholder="Enter your full name"
required
disabled={!isEditable}
/>
<TextField
label="Email Address"
type="email"
value={formData.email}
onChange={handleInputChange('email')}
placeholder="Enter your email"
required
disabled={!isEditable}
/>
<TextField
label="Bio"
value={formData.bio}
onChange={handleInputChange('bio')}
placeholder="Tell us about yourself"
multiline
rows={3}
disabled={!isEditable}
/>
</div>
</section>
{isEditable && (
<div className="flex justify-end gap-3">
<Button
type="button"
variant="secondary"
onClick={() => {
// Reset form to original values
if (profile) {
setFormData({
name: profile.name,
email: profile.email,
bio: profile.bio || '',
avatarUrl: profile.avatarUrl || ''
});
}
}}
disabled={isSaving}
>
Cancel
</Button>
<Button
type="submit"
variant="primary"
loading={isSaving}
disabled={isSaving}
>
Save Changes
</Button>
</div>
)}
</form>
</main>
);
Best Practices Summary
- Component Structure: Start with proper TypeScript interfaces and JSDoc comments
- State Management: Use keepsync stores for synchronized data, local state for UI data
- Event Handlers: Keep them clean with proper error handling and validation
- Accessibility: Use semantic HTML elements and proper ARIA attributes
- Styling: Follow Tailwind best practices with consistent spacing and responsive design
- Error Handling: Provide clear error messages and loading states for better UX
How to create server endpoints
- Use the server to handle operations that cannot be performed in the browser due to security restrictions
- All endpoints should be explicitly typed with TypeScript interfaces for request/response
- Provide proper error handling with appropriate HTTP status codes
- Use environment variables for sensitive configuration (API keys, database URLs, etc.)
- Always validate and sanitize input data before processing
Server Architecture
- CRITICAL: You MUST use the `ExpressWithRouteTracking` class instead of standard Express for automatic route discovery
- WARNING: If you use standard Express instead of `ExpressWithRouteTracking`, the API proxy functionality will break and your endpoints will not work properly
- WARNING: ALL SERVERS MUST INCLUDE A /ping ENDPOINT FOR HEALTH CHECKS, OTHERWISE THEY WILL FAIL; this should be there by default, so DO NOT DELETE it
- Place all endpoint handlers in the `src/index.ts` file for simplicity
- Use middleware for common functionality like authentication, logging, and validation
- Implement proper CORS handling for cross-origin requests
Route Discovery for Hosted Platform
Why ExpressWithRouteTracking is Required
The `ExpressWithRouteTracking` class is essential for the hosted platform to automatically discover and configure your API endpoints. When your application is deployed:
- The platform scans your server code to identify all registered routes
- It automatically generates nginx configuration to proxy requests to your endpoints
- Without route tracking, the platform cannot detect your API endpoints and they will not be accessible
This is why you must use `ExpressWithRouteTracking` instead of standard Express: it provides the route metadata that the hosting infrastructure requires to properly route requests to your application.
Security Best Practices
- Never expose sensitive information in response bodies or logs
- Use environment variables for all API keys, database credentials, and secrets
- Validate all input parameters and sanitize data to prevent injection attacks
- Implement rate limiting for public endpoints to prevent abuse
- Use HTTPS in production and validate SSL certificates
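As an illustration of the rate-limiting point, here is a minimal fixed-window limiter. It is in-memory and per-key, a sketch only: production services typically use a library (e.g. express-rate-limit) or a shared store such as Redis. The clock is injected to make the behavior testable:

```typescript
// Fixed-window rate limiter: at most `limit` calls per `windowMs` per key
class FixedWindowRateLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(
    private limit: number,
    private windowMs: number,
    private now: () => number = Date.now,
  ) {}

  allow(key: string): boolean {
    const t = this.now();
    const entry = this.counts.get(key);
    if (!entry || t - entry.windowStart >= this.windowMs) {
      // First request in a fresh window
      this.counts.set(key, { windowStart: t, count: 1 });
      return true;
    }
    entry.count++;
    return entry.count <= this.limit;
  }
}

// Simulated clock: 3 requests allowed per 1000ms window
let fakeTime = 0;
const limiter = new FixedWindowRateLimiter(3, 1000, () => fakeTime);
console.log(limiter.allow('1.2.3.4')); // true
console.log(limiter.allow('1.2.3.4')); // true
console.log(limiter.allow('1.2.3.4')); // true
console.log(limiter.allow('1.2.3.4')); // false: over the limit
fakeTime = 1000;                       // next window begins
console.log(limiter.allow('1.2.3.4')); // true
```

In a real endpoint the key would usually be the client IP or an API token, and a rejected request would receive HTTP 429.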
CORS and External Services
- Enable CORS for browser-based requests from your frontend
- Use the server as a proxy to external APIs to hide API keys from the client
- Implement proper error handling for external service failures
- Cache external API responses when appropriate to improve performance
- Use environment variables to configure external service endpoints
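The caching advice above can be sketched as a small TTL cache keyed by request URL. This is illustrative only; real deployments often prefer Redis or HTTP caching headers. The clock is injected for testability:

```typescript
// Minimal TTL cache for external API responses
class TtlCache<V> {
  private entries = new Map<string, { value: V; expiresAt: number }>();

  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  get(key: string): V | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (this.now() >= entry.expiresAt) {
      this.entries.delete(key); // expired: evict lazily on read
      return undefined;
    }
    return entry.value;
  }

  set(key: string, value: V): void {
    this.entries.set(key, { value, expiresAt: this.now() + this.ttlMs });
  }
}

// Usage sketch: consult the cache before calling the external API
let clock = 0;
const cache = new TtlCache<{ temperature: number }>(60_000, () => clock);
cache.set('weather:London', { temperature: 15 });
console.log(cache.get('weather:London')?.temperature); // 15
clock = 60_000;                                        // TTL elapsed
console.log(cache.get('weather:London'));              // undefined
```

In the weather proxy example below, a cache hit would let the endpoint respond without spending an external API call.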
File System and OS Operations
- Use Node.js built-in modules (fs, path, os) for file system operations
- Implement proper error handling for file operations (permissions, not found, etc.)
- Use absolute paths with proper path resolution for file operations
- Implement file upload/download endpoints with size limits and type validation
- Use streaming for large file operations to prevent memory issues
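To illustrate the streaming point without a full HTTP server, here is the core idea: process a large payload in fixed-size chunks instead of materializing it all at once. This is a sketch; a real endpoint would pipe `fs.createReadStream` into the response:

```typescript
// Yield fixed-size chunks so only one chunk is in memory at a time
function* chunksOf(buffer: Uint8Array, chunkSize: number): Generator<Uint8Array> {
  for (let offset = 0; offset < buffer.length; offset += chunkSize) {
    yield buffer.subarray(offset, offset + chunkSize);
  }
}

const payload = new Uint8Array(10); // stand-in for a large file
let chunkCount = 0;
let bytesSeen = 0;
for (const chunk of chunksOf(payload, 4)) {
  chunkCount++;           // e.g. res.write(chunk) in a real endpoint
  bytesSeen += chunk.length;
}
console.log(chunkCount);  // 3 (4 + 4 + 2 bytes)
console.log(bytesSeen);   // 10
```

Note that `subarray` creates views into the original buffer rather than copies, so the chunking itself allocates almost nothing.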
Database Operations
- Use environment variables for database connection strings
- Implement connection pooling for database operations
- Use parameterized queries to prevent SQL injection
- Implement proper transaction handling for multi-step operations
- Add database connection health checks and error recovery
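The parameterized-query rule separates the SQL text from the values. A dependency-free sketch of why string interpolation is dangerous (helper names are illustrative; drivers like pg accept the `{ text, values }` shape shown):

```typescript
// UNSAFE: user input becomes part of the SQL text itself
const unsafe = (email: string) =>
  `SELECT id FROM users WHERE email = '${email}'`;

// SAFE shape: the SQL text is fixed; values travel separately
const safe = (email: string) => ({
  text: 'SELECT id FROM users WHERE email = $1',
  values: [email],
});

const hostile = "x' OR '1'='1";
console.log(unsafe(hostile));
// SELECT id FROM users WHERE email = 'x' OR '1'='1' -- injected condition!
console.log(safe(hostile).text);
// SELECT id FROM users WHERE email = $1 -- text unchanged, value bound by the driver
```

With the safe shape, the database driver transmits the value out-of-band, so no user input can ever change the structure of the query.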
Error Handling
- Use consistent error response format across all endpoints
- Log errors with appropriate detail level (debug, info, warn, error)
- Return appropriate HTTP status codes (400, 401, 403, 404, 500, etc.)
- Provide meaningful error messages for client consumption
- Implement global error handling middleware
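A consistent error shape can be centralized in one helper that maps thrown errors to a status code and response body. This is a sketch; the names are illustrative, not part of any framework:

```typescript
interface ErrorBody { error: string }

// Errors we choose to surface to clients get their own class
class ValidationError extends Error {}

// Map validation errors to 400; everything else becomes a generic 500
// so internal details never leak to the client
function toErrorResponse(err: unknown): { status: number; body: ErrorBody } {
  if (err instanceof ValidationError) {
    return { status: 400, body: { error: err.message } };
  }
  // Log full details server-side; return only a generic message
  console.error('Unhandled error:', err);
  return { status: 500, body: { error: 'Internal server error' } };
}

console.log(toErrorResponse(new ValidationError('Email is required')).status); // 400
console.log(toErrorResponse(new Error('db connection refused')).body.error);   // "Internal server error"
```

Global error-handling middleware would call a helper like this in one place, instead of repeating status-code logic in every endpoint's catch block as the examples below do.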
Code Style
- Use explicit TypeScript types for all request/response objects
- Add JSDoc comments for endpoint descriptions and parameters
- Include example usage in comments for complex endpoints
- Use descriptive variable names that indicate purpose and type
- Implement proper logging with structured data
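Structured logging from the last point can be as simple as emitting one JSON object per line so logs stay machine-parseable (a minimal sketch):

```typescript
type LogLevel = 'debug' | 'info' | 'warn' | 'error';

// Emit one JSON line per event; returns the line for inspection
function logLine(
  level: LogLevel,
  message: string,
  fields: Record<string, unknown> = {},
): string {
  const line = JSON.stringify({
    level,
    message,
    timestamp: new Date().toISOString(),
    ...fields, // structured context: ids, routes, durations
  });
  console.log(line);
  return line;
}

const line = logLine('info', 'user created', { userId: '123', route: '/api/users' });
// {"level":"info","message":"user created","timestamp":"...","userId":"123","route":"/api/users"}
```

Log aggregators can then filter on `level`, `route`, or `userId` directly instead of regex-matching free-form text.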
Examples
External API Proxy with Secret Management
/**
* Weather API proxy endpoint
*
* Proxies requests to external weather service while hiding API key.
* Handles rate limiting, caching, and error transformation.
*
* @example
* GET /api/weather?city=London&country=UK
* Response: { temperature: 15, condition: "cloudy", humidity: 75 }
*/
import { Request, Response } from 'express';
interface WeatherRequest {
city: string;
country?: string;
units?: 'metric' | 'imperial';
}
interface WeatherResponse {
temperature: number;
condition: string;
humidity: number;
timestamp: string;
}
interface ExternalWeatherResponse {
main: {
temp: number;
humidity: number;
};
weather: Array<{
main: string;
description: string;
}>;
}
// Validate and sanitize input parameters
const validateWeatherRequest = (query: any): WeatherRequest => {
const { city, country, units } = query;
if (!city || typeof city !== 'string' || city.trim().length === 0) {
throw new Error('City parameter is required and must be a non-empty string');
}
if (country && typeof country !== 'string') {
throw new Error('Country parameter must be a string');
}
if (units && !['metric', 'imperial'].includes(units)) {
throw new Error('Units parameter must be either "metric" or "imperial"');
}
return {
city: city.trim(),
country: country?.trim(),
units: units || 'metric'
};
};
// Transform external API response to internal format
const transformWeatherResponse = (externalData: ExternalWeatherResponse): WeatherResponse => {
return {
temperature: Math.round(externalData.main.temp),
condition: externalData.weather[0]?.main.toLowerCase() || 'unknown',
humidity: externalData.main.humidity,
timestamp: new Date().toISOString()
};
};
// Weather endpoint implementation
app.get('/api/weather', async (req: Request, res: Response) => {
try {
// Validate and sanitize input
const weatherRequest = validateWeatherRequest(req.query);
// Get API key from environment (hidden from client)
const apiKey = process.env.WEATHER_API_KEY;
if (!apiKey) {
console.error('Weather API key not configured');
return res.status(500).json({
error: 'Weather service temporarily unavailable'
});
}
// Build external API URL with secret key
const baseUrl = 'https://api.openweathermap.org/data/2.5/weather';
const params = new URLSearchParams({
q: weatherRequest.country
? `${weatherRequest.city},${weatherRequest.country}`
: weatherRequest.city,
units: weatherRequest.units,
appid: apiKey // API key hidden from client
});
// Make external API request
const response = await fetch(`${baseUrl}?${params}`);
if (!response.ok) {
// Handle external API errors
if (response.status === 401) {
console.error('Invalid weather API key');
return res.status(500).json({
error: 'Weather service configuration error'
});
}
if (response.status === 404) {
return res.status(404).json({
error: 'City not found'
});
}
throw new Error(`Weather API error: ${response.status}`);
}
const externalData: ExternalWeatherResponse = await response.json();
// Transform and return response
const weatherData = transformWeatherResponse(externalData);
res.json(weatherData);
} catch (error) {
console.error('Weather endpoint error:', error);
if (error instanceof Error && error.message.includes('City parameter')) {
return res.status(400).json({ error: error.message });
}
res.status(500).json({
error: 'Failed to fetch weather data'
});
}
});
File System Operations
/**
* File upload endpoint
*
* Handles file uploads with size limits, type validation, and secure storage.
* Uses environment variables for configuration and hides file system paths.
*
* @example
* POST /api/upload
* Content-Type: multipart/form-data
* Body: file (image/jpeg, max 5MB)
*/
import multer from 'multer';
import { promises as fs } from 'fs';
import path from 'path';
import crypto from 'crypto';
interface UploadResponse {
fileId: string;
filename: string;
size: number;
mimeType: string;
uploadedAt: string;
}
// Configure multer for file uploads
const upload = multer({
storage: multer.memoryStorage(),
limits: {
fileSize: parseInt(process.env.MAX_FILE_SIZE || '5242880'), // 5MB default
files: 1
},
fileFilter: (req, file, cb) => {
// Validate file types
const allowedTypes = ['image/jpeg', 'image/png', 'image/gif'];
if (!allowedTypes.includes(file.mimetype)) {
return cb(new Error('Invalid file type. Only JPEG, PNG, and GIF are allowed.'));
}
cb(null, true);
}
});
// Generate secure filename to prevent path traversal
const generateSecureFilename = (originalName: string): string => {
const timestamp = Date.now();
const randomBytes = crypto.randomBytes(8).toString('hex');
const extension = path.extname(originalName);
return `${timestamp}-${randomBytes}${extension}`;
};
// File upload endpoint
app.post('/api/upload', upload.single('file'), async (req: Request, res: Response) => {
try {
if (!req.file) {
return res.status(400).json({ error: 'No file provided' });
}
// Validate file size
const maxSize = parseInt(process.env.MAX_FILE_SIZE || '5242880');
if (req.file.size > maxSize) {
return res.status(400).json({
error: `File too large. Maximum size is ${maxSize / 1024 / 1024}MB`
});
}
// Generate secure filename
const secureFilename = generateSecureFilename(req.file.originalname);
// Get upload directory from environment
const uploadDir = process.env.UPLOAD_DIR || path.join(process.cwd(), 'uploads');
// Ensure upload directory exists
await fs.mkdir(uploadDir, { recursive: true });
// Save file to disk
const filePath = path.join(uploadDir, secureFilename);
await fs.writeFile(filePath, req.file.buffer);
// Generate file ID for tracking
const fileId = crypto.randomUUID();
const response: UploadResponse = {
fileId,
filename: secureFilename,
size: req.file.size,
mimeType: req.file.mimetype,
uploadedAt: new Date().toISOString()
};
res.status(201).json(response);
} catch (error) {
console.error('File upload error:', error);
if (error instanceof Error && error.message.includes('Invalid file type')) {
return res.status(400).json({ error: error.message });
}
res.status(500).json({ error: 'Failed to upload file' });
}
});
Database Operations with Environment Variables
/**
* User data endpoint
*
* Handles user data operations with database connection using environment variables.
* Implements proper connection pooling and error handling.
*
* @example
* GET /api/users/123
* Response: { id: "123", name: "John Doe", email: "john@example.com" }
*/
import { Pool } from 'pg';
interface User {
id: string;
name: string;
email: string;
createdAt: string;
}
interface CreateUserRequest {
name: string;
email: string;
}
// Database connection pool using environment variables
const pool = new Pool({
connectionString: process.env.DATABASE_URL,
ssl: process.env.NODE_ENV === 'production' ? { rejectUnauthorized: false } : false,
max: 20, // Maximum number of connections
idleTimeoutMillis: 30000,
connectionTimeoutMillis: 2000,
});
// Validate user input
const validateCreateUserRequest = (body: any): CreateUserRequest => {
const { name, email } = body;
if (!name || typeof name !== 'string' || name.trim().length === 0) {
throw new Error('Name is required and must be a non-empty string');
}
if (!email || typeof email !== 'string' || !email.includes('@')) {
throw new Error('Valid email address is required');
}
return {
name: name.trim(),
email: email.trim().toLowerCase()
};
};
// Get user by ID
app.get('/api/users/:id', async (req: Request, res: Response) => {
const { id } = req.params;
try {
// Validate user ID
if (!id || typeof id !== 'string') {
return res.status(400).json({ error: 'Valid user ID is required' });
}
// Query database with parameterized query
const query = 'SELECT id, name, email, created_at FROM users WHERE id = $1';
const result = await pool.query(query, [id]);
if (result.rows.length === 0) {
return res.status(404).json({ error: 'User not found' });
}
const user: User = {
id: result.rows[0].id,
name: result.rows[0].name,
email: result.rows[0].email,
createdAt: result.rows[0].created_at.toISOString()
};
res.json(user);
} catch (error) {
console.error('Database query error:', error);
res.status(500).json({ error: 'Failed to fetch user data' });
}
});
// Create new user
app.post('/api/users', async (req: Request, res: Response) => {
try {
// Validate request body
const userData = validateCreateUserRequest(req.body);
// Check if email already exists
const existingUser = await pool.query(
'SELECT id FROM users WHERE email = $1',
[userData.email]
);
if (existingUser.rows.length > 0) {
return res.status(409).json({ error: 'Email already registered' });
}
// Insert new user with parameterized query
const insertQuery = `
INSERT INTO users (name, email, created_at)
VALUES ($1, $2, NOW())
RETURNING id, name, email, created_at
`;
const result = await pool.query(insertQuery, [userData.name, userData.email]);
const newUser: User = {
id: result.rows[0].id,
name: result.rows[0].name,
email: result.rows[0].email,
createdAt: result.rows[0].created_at.toISOString()
};
res.status(201).json(newUser);
} catch (error) {
console.error('Create user error:', error);
if (error instanceof Error && error.message.includes('required')) {
return res.status(400).json({ error: error.message });
}
res.status(500).json({ error: 'Failed to create user' });
}
});
Environment Variables Setup
Create a `.env` file in the project root with the following structure:
# Server Configuration
PORT=6080
NODE_ENV=development
# External API Keys (hidden from client)
WEATHER_API_KEY=your_openweathermap_api_key
STRIPE_SECRET_KEY=your_stripe_secret_key
GOOGLE_MAPS_API_KEY=your_google_maps_api_key
# Database Configuration
DATABASE_URL=postgresql://username:password@localhost:5432/database_name
# File Upload Configuration
MAX_FILE_SIZE=5242880
UPLOAD_DIR=/path/to/secure/upload/directory
# Security Configuration
JWT_SECRET=your_jwt_secret_key
SESSION_SECRET=your_session_secret
Global Project Instructions
Node Modules
- Use `npm` for package management
- Whenever an action requires an external Node module, always read the `package.json` first to check what modules you have at your disposal
- If a module you think you need does not exist in `package.json`, do not use arbitrary modules. Come up with an alternative strategy to implement the task.
- If you think it is absolutely not possible, inform the user with a clear explanation that the task will require a Node module.
Linting
- Put `_` ahead of parameters that aren't used in a function (e.g., `_req` in the following)
router.get('/health', (_req, res) => {
res.json({ status: 'ok' });
});
The tonk.config.json file includes useful information about the project and the project purpose.
IMPORTANT: Synchronization is best performed through the keepsync library. You must load the instructions for keepsync by reading instructions/keepsync/llms.txt and looking at the examples in instructions/keepsync/examples
DO NOT EDIT - AUTO-GENERATED FROM docs/src/llms/templates/worker/README.md
This file is automatically generated from the documentation.
Edit the source file instead: docs/src/llms/templates/worker/README.md
Generated on: 2025-07-17T13:11:10.055Z
File Listening Pattern - IMPORTANT
When implementing file watching functionality, ALWAYS use the existing FileListener pattern from `src/listeners/fileListener.ts`. Do NOT create custom file watcher services.
Proper FileListener Usage:
- Import the FileListener: Use `import { FileListener, createAndStartFileListener } from "./listeners/fileListener";`
- Define your data transformation: Create a transformer function that converts file content to your desired format
- Define path transformation: Create a function that maps file paths to keepsync document paths
- Define data mapper: Create a mapper function that handles how the transformed data gets written to keepsync
- Use the pattern: Initialize the FileListener with your configuration
Example Implementation:
import { createAndStartFileListener } from "./listeners/fileListener";
// Note: Instead of creating services/ directory, add API routes to the /server proxy in the react template
// The server is located at server/src/index.ts and handles API endpoints
// In your main initialization:
const fileListener = await createAndStartFileListener(
'/path/to/obsidian/file.md',
(_filePath) => 'obsidian/topics', // keepsync path
(fileContent, _filePath) => parseObsidianContent(fileContent), // transformer
(existingDoc, newData) => ({ ...existingDoc, ...newData }) // mapper
);
Why This Pattern?
- Consistency: All workers use the same file watching approach
- Reliability: The FileListener handles edge cases, debouncing, and error recovery
- Integration: Built-in keepsync integration with proper data mapping
- Monitoring: Standardized logging and error handling
Remember: The FileListener already handles chokidar setup, file reading, error handling, and keepsync integration. Focus on your business logic (data transformation) rather than file watching infrastructure.
DO NOT EDIT - AUTO-GENERATED FROM docs/src/llms/templates/workspace/README.md
This file is automatically generated from the documentation.
Edit the source file instead: docs/src/llms/templates/workspace/README.md
Generated on: 2025-07-17T13:11:10.059Z
What is a Workspace?
A workspace is a structured development environment designed to function as an exocortex or second brain - a conversational data platform where users describe their data needs in natural language, and a coding agent materializes the technical solution. It serves as a comprehensive platform where you can pull data from various sources, run it through processing workflows, and visualize results through React applications.
Workspace Structure
A workspace follows a specific organizational structure with four main directories:
- `/console` — A React app that provides a file browser showing document state in keepsync stores, plus a worker tab with up-to-date information on which workers have run, when they ran, and what data was moved
- `/instructions` — Contains instructions and guidelines for coding agents. Instructions are co-located with functionality: project-level instructions in `/instructions/`, view-specific guidance in `/views/llms.txt`, worker patterns in `/workers/llms.txt`, etc.
- `/views` — Storage location for Tonk apps (or views) created using `tonk-create`. These are interactive React applications that visualize data and provide user interfaces
- `/workers` — Contains Tonk workers created with `tonk-create`. These handle background processing, data ingestion, transformation, and can listen to keepsync stores or file systems for automated workflows
Agent Interaction Model
Command-Line Assistant: The agent interacts through command-line tool use, acting as a conversational assistant that helps users build data processing pipelines. The agent should:
- Guide users through the process of creating views and workers
- Vibecode implementations (generate actual code on behalf of the user)
- Suggest specific parsing libraries and technical approaches
- Ask clarifying questions to disambiguate user intent
- Reference co-located instructions (e.g., `/workers/llms.txt` for keepsync listening patterns)
Data Storage: Keepsync
The workspace uses keepsync as its data backbone. Keepsync handles data storage and synchronization between components. Workers read from and write to keepsync stores, creating flows of data that views can then visualize. The four directories don't communicate directly - they're organized for convenience and quick access to context-specific information.
Typical Workflow Example
Here's how users typically build data processing pipelines:
- Ingest: Create a worker to bring in data (e.g., Google data) → store in keepsync
- Visualize: Create a view to display that data
- Transform: Create another worker to read keepsync data → transform into summaries (e.g., OpenAI integration) → store summaries
- Visualize: Update view to show summaries
- Cross-reference: Create worker to watch local files → transform and store in keepsync
- Join: Create worker to listen to multiple keepsync stores → perform joins → store results
- Visualize: Create view showing cross-referenced summaries
This creates flows of data and visualizations over the flows - an iterative process where the workspace grows organically through cycles of ingestion, transformation, and visualization.
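The chain above can be sketched end-to-end. The snippet below uses tiny in-memory stand-ins for keepsync's `readDoc`/`writeDoc` (the real `@tonk/keepsync` calls are async and synchronize across clients; these do not) purely to show how the ingest → transform → join stores feed each other:

```typescript
// In-memory stand-ins for keepsync document operations (illustration only)
const store = new Map<string, unknown>();
const writeDoc = (path: string, data: unknown): void => { store.set(path, data); };
const readDoc = <T>(path: string): T | undefined => store.get(path) as T | undefined;

interface RawEvent { id: number; title: string; }
interface Summary { id: number; summary: string; }

// 1. Ingest: a worker writes raw records into a store
writeDoc("google/events", [{ id: 1, title: "Standup" }, { id: 2, title: "Review" }]);

// 2. Transform: a second worker reads the raw data and stores summaries
const events = readDoc<RawEvent[]>("google/events") ?? [];
writeDoc("summaries/events", events.map(e => ({ id: e.id, summary: `Event: ${e.title}` })));

// 3. Join: a third worker combines stores into a single overview document
const summaries = readDoc<Summary[]>("summaries/events") ?? [];
const overview = { count: summaries.length, summaries };
writeDoc("joined/overview", overview);
```

A view would then subscribe to `joined/overview` and re-render as new data flows through.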
Agent Guidelines
File Format Handling: If a file format isn't currently handled, guide the user to:
- Create a worker using `tonk-create -t worker -n <name> -d <description of the worker>`
- Vibecode the worker to parse the file format (suggest specific parsing libraries)
- Store parsed data in keepsync
- Create a view to visualize the data
File Listening Pattern - IMPORTANT
When implementing file watching functionality, ALWAYS use the existing FileListener pattern from `src/listeners/fileListener.ts`. Do NOT create custom file watcher services.
Proper FileListener Usage:
1. Import the FileListener: Use `import { FileListener, createAndStartFileListener } from "./listeners/fileListener";`
2. Define your data transformation: Create a transformer function that converts file content to your desired format
3. Define path transformation: Create a function that maps file paths to keepsync document paths
4. Define data mapper: Create a mapper function that handles how the transformed data gets written to keepsync
5. Use the pattern: Initialize the FileListener with your configuration
Example Implementation:
import { createAndStartFileListener } from "./listeners/fileListener";
// Note: Instead of creating services/ directory, add API routes to the /server proxy in the react template
// The server is located at server/src/index.ts and handles API endpoints
// In your main initialization:
const fileListener = await createAndStartFileListener(
'/path/to/obsidian/file.md',
(filePath) => 'obsidian/topics', // keepsync path
(fileContent, filePath) => parseObsidianContent(fileContent), // transformer
(existingDoc, newData) => ({ ...existingDoc, ...newData }) // mapper
);
Why This Pattern?
- Consistency: All workers use the same file watching approach
- Reliability: The FileListener handles edge cases, debouncing, and error recovery
- Integration: Built-in keepsync integration with proper data mapping
- Monitoring: Standardized logging and error handling
Remember: The FileListener already handles chokidar setup, file reading, error handling, and keepsync integration. Focus on your business logic (data transformation) rather than file watching infrastructure.
Instructions Reference: Always check relevant `llms.txt` files for component-specific patterns and guidelines before vibecoding implementations.
Tonk CLI Commands
The Tonk CLI provides essential commands for managing your workspace and applications:
Main CLI (`tonk`)
- `tonk -d` — Run the Tonk daemon (background server for managing bundles and stores)
- `tonk hello` — Say hello to start and launch the tonk daemon
- `tonk auth` — Log in to your Tonk account
- `tonk deploy` — Deploy a Tonk bundle to an existing server
- `tonk server` — Manage Tonk servers
Bundle Management
- `tonk push` — Package, upload, build and start a bundle on the Tonk server
- `tonk start <bundleName>` — Start a bundle on a route
- `tonk ps` — List running bundles
- `tonk ls` — List available bundles on the Tonk server
- `tonk kill <serverId>` — Stop a running bundle server
- `tonk delete <bundleName>` — Delete a bundle from the server (alias: `rm`)
- `tonk proxy <bundleName>` — Create a reverse proxy to access a Tonk bundle
Worker Management (`tonk worker`)
- `tonk worker ls` — List all registered workers
- `tonk worker inspect <nameOrId>` — Inspect a specific worker
- `tonk worker start <nameOrId>` — Start a worker
- `tonk worker stop <nameOrId>` — Stop a worker
- `tonk worker rm <nameOrId>` — Remove a registered worker
- `tonk worker ping <nameOrId>` — Ping a worker to check its status
- `tonk worker logs <nameOrId>` — View logs for a worker
- `tonk worker register [dir]` — Register a worker with Tonk
- `tonk worker install <package>` — Install and start a worker from npm
- `tonk worker init` — Initialize a new worker configuration file
Create Command (`tonk-create`)
The create command scaffolds code for your Tonk projects:
Usage: tonk-create [options]
Options:
- `-v, --version` — Output the current version
- `-i, --init` — Initialize in the current folder (instead of creating a new directory)
- `-t, --template <type>` — Template type: `react`, `worker`, or `workspace`
- `-n, --name <name>` — Project name
- `-d, --description <description>` — Project description
Template Types:
- `react` — "Create apps with your data" - Interactive React applications for data visualization
- `worker` — "Retrieve data to use later" - Background processing and data ingestion
- `workspace` — "Organize multiple projects" - Complete development environment structure
Examples:
# Non-interactive mode
tonk-create -t react -n my-dashboard -d "Sales data visualization"
tonk-create --init -t workspace # Initialize workspace in current directory
Daemon Mode and Server Functionality
The daemon (`tonk -d`) provides the core infrastructure:
- Bundle Storage: Manages code bundles and their lifecycle
- Data Stores: Maintains keepsync stores for data synchronization
- Process Coordination: Handles running workers and serving React apps
- Worker Registry: Manages registered workers and their states
- Server Infrastructure: Runs TonkServer for handling requests
LLM Agent Workspace Navigation
When working within a Tonk workspace, the LLM should understand these interaction patterns:
Project Creation Workflow
- Understand Requirements: Determine if the user needs data ingestion (worker), visualization (react), or a full environment (workspace)
- Guide CLI Usage: Use `tonk-create` with the appropriate template and meaningful names
- Implement Logic: Vibecode the functionality using established patterns and libraries
- Register and Start: For workers, use `tonk worker register` and `tonk worker start` to activate them
Worker Management Patterns
- Development Cycle: Create with `tonk-create -t worker -n <name> -d <description>`, register with `tonk worker register`, start with `tonk worker start`
- Debugging: Use `tonk worker logs <worker>` to view output and `tonk worker ping <worker>` to check status
- Monitoring: Check `tonk worker ls` to see all registered workers and their states
- Cleanup: Use `tonk worker stop <worker>` and `tonk worker rm <worker>` to remove unused workers
Development Best Practices
- Check Existing Patterns: Look at `/workers/` and `/views/` for similar implementations
- Follow Conventions: Use established libraries and coding patterns from the workspace
- Reference Instructions: Always check relevant `llms.txt` files for component-specific guidance
- Use FileListener Pattern: For file watching, always use the existing FileListener from `src/listeners/fileListener.ts`
Troubleshooting and Debugging
- Daemon Status: Ensure `tonk -d` is running for all workspace operations
- Worker Status: Use `tonk worker ls` and `tonk worker ping <worker>` to monitor worker health
- Logs: Check `tonk worker logs <worker>` for error messages and debugging information
- Data Flow: Verify keepsync stores through the console app for data debugging
Core Philosophy
The workspace transforms traditional development from static code creation into a conversational data platform. Users don't need to understand keepsync APIs or worker patterns - they describe what they want in natural language, and the agent handles technical implementation through vibecoding. The boundary between user intent and technical execution becomes fluid, with the agent serving as both consultant and implementer.
Key Characteristics
Agent-Centric Design: The workspace is designed to be primarily updated and managed through coding agents. Rather than manual file manipulation, a coding agent serves as your primary assistant, understanding the workspace structure and making intelligent modifications.
Interactive Data Pipeline: The workspace enables a complete data processing pipeline:
- Data Ingestion - Pull information from different sources
- Processing - Run data through transformation and analysis workflows
- Visualization - Display results in custom React applications
- Feedback Loop - React apps can collect user input and update the underlying data
Living System: Unlike static codebases, a workspace is designed to evolve and respond. The React apps within it can take feedback and input from users, creating a dynamic system that learns and adapts based on interaction.
Use Cases
A workspace is ideal for:
- Building personal knowledge management systems
- Creating data analysis and visualization dashboards
- Developing interactive tools for research and exploration
- Prototyping applications that need to process and display complex data
- Building systems that combine automation (workers) with human interaction (views)
The workspace concept transforms traditional development from static code creation into dynamic, agent-assisted system building where the boundary between tool and user becomes fluid.
Global Project Instructions
Node Modules
- Use `npm` for package management
- Whenever an action requires an external Node module, always read the `package.json` first to check what modules you have at your disposal
- If a module you think you need does not exist in `package.json`, do not use arbitrary modules
  - Come up with an alternative strategy to implement the task.
  - If you think it is absolutely not possible, inform the user with a clear explanation that the task will require a Node module.
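As a sketch of the guideline above, a check like the following can run before an import is attempted. The names (`hasModule`, `PackageJson`) are illustrative helpers, not part of any Tonk API:

```typescript
// Minimal shape of the package.json fields we care about
interface PackageJson {
  dependencies?: Record<string, string>;
  devDependencies?: Record<string, string>;
}

// True only if the module is declared in dependencies or devDependencies
const hasModule = (pkg: PackageJson, name: string): boolean =>
  Boolean(pkg.dependencies?.[name] ?? pkg.devDependencies?.[name]);

// Example: decide whether a CSV-parsing module is available
const pkg: PackageJson = {
  dependencies: { express: "^4.18.0" },
  devDependencies: { typescript: "^5.0.0" },
};

hasModule(pkg, "express");   // true - safe to use
hasModule(pkg, "csv-parse"); // false - find an alternative strategy
```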
Linting
- Put `_` ahead of parameters that aren't used in a function (e.g., `_req` in the following)
router.get('/health', (_req, res) => {
res.json({ status: 'ok' });
});
The tonk.config.json file includes useful information about the project and the project purpose.
IMPORTANT: Synchronization is best performed through the keepsync library. You must load the instructions for keepsync by reading instructions/keepsync/llms.txt and looking at the examples in instructions/keepsync/examples
DO NOT EDIT - AUTO-GENERATED FROM docs/src/llms/templates/workspace/views.md
This file is automatically generated from the documentation.
Edit the source file instead: docs/src/llms/templates/workspace/views.md
Generated on: 2025-07-17T13:11:10.057Z
Tonk View Architecture and Usage Guide
Overview
Tonk views are React-based frontend applications that integrate with the Tonk ecosystem for building local-first applications. They provide modern web UI/UX with real-time data synchronization using KeepSync and Automerge CRDTs, supporting offline-first collaborative experiences.
Getting Started
The view structure described in this guide is automatically created when you run `tonk-create -t react -n <name> -d <description>`. This command will guide you through the process of creating a new view and scaffold a complete React application template folder with all the necessary files and configurations.
To create a new view:
tonk-create -t react -n <name> -d <description>
The CLI will prompt you to select "react" as the project type and guide you through the setup process, generating the folder structure and files detailed below.
View Structure
Core Files and Directories
view/
├── src/
│ ├── index.tsx # Main entry point with React app initialization
│ ├── App.tsx # Root React component with routing
│ ├── index.css # Global styles
│ ├── views/ # Page-level view components
│ ├── components/ # Reusable UI components
│ ├── stores/ # Zustand state management stores
│ └── # Note: Instead of services/ directory, add API routes to the /server proxy
│ └── # The server is located at server/src/index.ts and handles API endpoints
├── server/ # Express.js backend server
│ ├── src/
│ │ └── index.ts # Server entry point with API routes
│ ├── package.json # Server-specific dependencies
│ └── tsconfig.json # Server TypeScript configuration
├── instructions/ # LLM-readable documentation
├── public/ # Static assets (favicon, manifest, etc.)
├── scripts/ # Build and deployment scripts
├── package.json # Frontend package configuration
├── tonk.config.json # Tonk platform configuration
├── vite.config.ts # Vite build configuration
├── tailwind.config.cjs # Tailwind CSS configuration
├── tsconfig.json # TypeScript configuration
└── index.html # HTML template
Key Dependencies
- `react` & `react-dom`: Core React framework
- `@tonk/keepsync`: Core data synchronization
- `@automerge/automerge-repo`: CRDT document management
- `@automerge/automerge-repo-network-websocket`: WebSocket networking
- `@automerge/automerge-repo-storage-indexeddb`: Browser storage
- `react-router-dom`: Client-side routing
- `zustand`: Lightweight state management
- `vite`: Fast build tool and dev server
- `tailwindcss`: Utility-first CSS framework
- `lucide-react`: Icon library
- `express`: Backend server framework
- `cors`: Cross-origin resource sharing
Configuration Files
tonk.config.json
Tonk platform integration:
- View name and description
- Platform type ("react")
- Template reference
- Project metadata
vite.config.ts
Build and development configuration:
- Plugins: React, PWA, WebAssembly, top-level await
- Dev Server: Port (3000), proxy configuration for sync and API
- Build: Output directory, asset management, code splitting
- Optimizations: ESNext target, vendor chunking
package.json
Frontend application configuration:
- Scripts: Development, build, preview, test, deployment
- Workspaces: Includes server subdirectory
- Dependencies: React ecosystem, Tonk/KeepSync, UI libraries
- Dev Dependencies: Build tools, TypeScript, testing framework
Environment Configuration
- Development: Vite dev server on port 3000
- Sync Proxy: WebSocket proxy to `ws://localhost:7777/sync`
- API Proxy: HTTP proxy to `http://localhost:6080/api`
- PWA: Progressive Web App capabilities enabled
View Architecture Patterns
React App Initialization
Views initialize with:
- KeepSync engine configuration
- WebSocket client adapter setup
- IndexedDB storage adapter
- React router integration
- Strict mode for development
KeepSync Integration
Views configure sync engines with:
- Browser WebSocket adapters for real-time sync
- IndexedDB storage for offline persistence
- Document read/write operations from components
- Automatic reconnection handling
- Cross-tab synchronization
Component Architecture
- Views: Page-level components with routing
- Components: Reusable UI elements with proper separation
- Stores: Zustand stores for global state management
- Services: Data fetching and external API integration
- Hooks: Custom React hooks for KeepSync operations
Data Flow Patterns
- Component mounts and subscribes to KeepSync documents
- Real-time updates received via WebSocket
- Local state updated through Zustand stores
- UI re-renders with new data
- User interactions trigger document updates
- Changes propagated to other connected clients
Development Workflow
Local Development
# CORRECT - Run from within the view directory:
cd view/your-view-name/
1. `pnpm install` - Install all dependencies
2. `pnpm dev` - Start both frontend and backend concurrently
3. `pnpm dev:client` - Start only frontend (port 3000)
4. `pnpm dev:server` - Start only backend server
5. `pnpm build` - Build for production
6. `pnpm serve` - Preview production build
# INCORRECT - Do NOT run from workspace root:
# npm run dev # ❌ This will fail - no dev script in workspace root
Always navigate to the view directory (`cd view/your-view-name/`) first.
Production Deployment
- `pnpm build` - Compile and bundle
- `tonk deploy` - Deploy to Tonk platform
- Static assets served via CDN
- Server handles API routes and proxying
Server Integration
Express.js Backend
Views include integrated Express servers:
- API Routes: Custom business logic endpoints
- Proxy Configuration: Route sync and external API calls
- CORS Support: Cross-origin request handling
- Static Serving: Fallback for production builds
- Error Handling: Structured error responses
Server Structure
server/
├── src/
│ └── index.ts # Express app with routing
├── package.json # Server dependencies
└── tsconfig.json # Server TypeScript config
State Management
Zustand Stores
Lightweight state management with:
- Simple store creation and usage
- TypeScript integration
- Persistence middleware support
- DevTools integration
- Minimal boilerplate
KeepSync Document State
- Document subscriptions in React components
- Automatic re-rendering on document changes
- Optimistic UI updates
- Conflict-free collaborative editing
- Offline state management
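A purely illustrative sketch of the optimistic-update idea: apply the local change immediately, then reconcile when the synced copy arrives. Automerge CRDTs perform the real field-wise, conflict-free merging; the timestamp rule below is a deliberate simplification:

```typescript
interface Doc { title: string; updatedAt: number; }

// Apply a local edit right away so the UI feels instant
const applyOptimistic = (local: Doc, change: Partial<Doc>, now: number): Doc => ({
  ...local,
  ...change,
  updatedAt: now,
});

// Naive last-writer-wins reconciliation once the synced doc arrives.
// Real CRDT merging is per-field and conflict-free, not a timestamp race.
const reconcile = (local: Doc, synced: Doc): Doc =>
  synced.updatedAt >= local.updatedAt ? synced : local;

const local = applyOptimistic({ title: "old", updatedAt: 1 }, { title: "new" }, 2);
const merged = reconcile(local, { title: "remote", updatedAt: 3 });
// merged.title === "remote": the later synced write wins in this sketch
```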
UI/UX Patterns
Modern React Patterns
- Functional components with hooks
- TypeScript for type safety
- Suspense for loading states
- Error boundaries for error handling
- React Router for navigation
Styling and Design
- Tailwind CSS: Utility-first styling approach
- Component Libraries: Lucide React for icons
- Responsive Design: Mobile-first approach
- Dark Mode: System preference detection
- Accessibility: ARIA labels and semantic HTML
Progressive Web App Features
- Service Worker: Automatic updates
- Manifest: App-like experience
- Offline Support: Via KeepSync and IndexedDB
- Install Prompt: Add to home screen
- Push Notifications: Real-time updates
Best Practices
Performance Optimization
- Code Splitting: Vendor and feature-based chunks
- Tree Shaking: Unused code elimination
- Lazy Loading: Route-based code splitting
- Bundle Analysis: Size monitoring and optimization
- Caching: Aggressive caching strategies
Error Handling
- Error Boundaries: Component-level error isolation
- Network Resilience: Offline capability and retry logic
- User Feedback: Toast notifications and error states
- Logging: Structured error reporting
- Graceful Degradation: Progressive enhancement
Security
- Input Validation: Client and server-side validation
- CORS Configuration: Proper origin restrictions
- Content Security Policy: XSS protection
- Secure Headers: Security-focused HTTP headers
- Environment Secrets: Proper secret management
Testing
- Unit Tests: Component testing with Jest
- Integration Tests: End-to-end user flows
- Visual Regression: UI consistency testing
- Performance Tests: Bundle size and loading metrics
- Accessibility Tests: WCAG compliance validation
Integration Points
With Tonk Platform
- Deployment: Automated build and deploy pipeline
- Monitoring: Performance and error tracking
- Configuration: Environment-based settings
- Scaling: Auto-scaling based on traffic
With KeepSync
- Real-time Sync: Multi-user collaboration
- Offline First: Local-first data architecture
- Conflict Resolution: Automatic merge strategies
- Version History: Document change tracking
With External Services
- API Integration: RESTful and GraphQL services
- Authentication: OAuth and JWT token handling
- File Uploads: Multi-part form data handling
- Third-party Libraries: NPM ecosystem integration
Example Usage Patterns
KeepSync Document Operations
// Subscribe to document changes
const [data, setData] = useState(null);
useEffect(() => {
const unsubscribe = engine.subscribe(documentPath, setData);
return unsubscribe;
}, []);
// Update document
const updateDocument = (newData) => {
engine.updateDocument(documentPath, newData);
};
Zustand Store Pattern
// Create typed store
const useAppStore = create<AppState>((set) => ({
user: null,
setUser: (user) => set({ user }),
loading: false,
setLoading: (loading) => set({ loading }),
}));
React Router Setup
// App.tsx routing configuration
<Routes>
<Route path="/" element={<HomePage />} />
<Route path="/dashboard" element={<Dashboard />} />
<Route path="/settings" element={<Settings />} />
</Routes>
Component with KeepSync
// View component with real-time data
const Dashboard = () => {
const { data, updateData } = useKeepSyncDocument('dashboard');
return (
<div>
<h1>{data?.title || 'Loading...'}</h1>
<button onClick={() => updateData({ title: 'Updated!' })}>
Update Title
</button>
</div>
);
};
This architecture enables views to serve as modern, collaborative web applications with real-time synchronization, offline capabilities, and seamless integration with the Tonk ecosystem.
DO NOT EDIT - AUTO-GENERATED FROM docs/src/llms/templates/workspace/workers.md
This file is automatically generated from the documentation.
Edit the source file instead: docs/src/llms/templates/workspace/workers.md
Generated on: 2025-07-17T13:11:10.063Z
Tonk Worker Architecture and Usage Guide
Overview
Tonk workers are Node.js-based services that integrate with the Tonk ecosystem for building local-first applications. They provide HTTP API endpoints and integrate with KeepSync for distributed data synchronization using Automerge CRDTs.
Getting Started
The worker structure described in this guide is automatically created when you run `tonk-create`. This command will guide you through the process of creating a new worker and scaffold a complete worker template folder with all the necessary files and configurations.
IMPORTANT: Always run `tonk-create` from inside the `workers/` directory to ensure the new worker is created in the correct location.
To create a new worker:
cd workers/
tonk-create
The CLI will prompt you to select "worker" as the project type and guide you through the setup process, generating the folder structure and files detailed below. The worker will be created as a subdirectory within `workers/`.
Worker Structure
Core Files and Directories
worker/
├── src/
│ ├── index.ts # Main entry point with HTTP server
│ ├── cli.ts # CLI interface for worker control
│ ├── utils/ # Utility modules
│ ├── listeners/ # Event listeners (file, keepsync)
│ └── io/ # I/O handlers and types
├── instructions/ # LLM-readable documentation
├── creds/ # Credential storage
├── package.json # Node.js package configuration
├── worker.config.js # Worker-specific configuration
├── tonk.config.json # Tonk platform configuration
└── tsconfig.json # TypeScript configuration
Key Dependencies
- `@tonk/keepsync`: Core data synchronization
- `@automerge/automerge-repo`: CRDT document management
- `@automerge/automerge-repo-network-websocket`: WebSocket networking
- `@automerge/automerge-repo-storage-nodefs`: File system storage
- `express` or native `http`: HTTP server functionality
- `commander`: CLI argument parsing
- `chokidar`: File system watching
- `dotenv`: Environment variable management
Configuration Files
worker.config.js
Primary worker configuration defining:
- Runtime: Port, health check endpoints, intervals
- Process: File execution, instances, auto-restart, memory limits
- CLI: Script paths and command arguments
- Schemas: Data validation schemas for KeepSync documents
- Environment: Production/development settings
tonk.config.json
Tonk platform integration:
- Worker name and description
- Platform type ("worker")
- Template reference
- Project metadata
Environment Variables
- `SYNC_WS_URL`: WebSocket URL for KeepSync (default: ws://localhost:7777/sync)
- `SYNC_URL`: HTTP URL for KeepSync (default: http://localhost:7777)
- `WORKER_PORT`: Port for HTTP server (default: 5555)
- `NODE_ENV`: Runtime environment (production/development)
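A sketch of resolving these variables with their defaults. In the template the values come from `process.env` (loaded via dotenv); `resolveConfig` is an illustrative name, not a template export:

```typescript
interface WorkerConfig {
  syncWsUrl: string;
  syncUrl: string;
  workerPort: number;
  nodeEnv: string;
}

// Fall back to the documented defaults when a variable is unset
const resolveConfig = (env: Record<string, string | undefined>): WorkerConfig => ({
  syncWsUrl: env.SYNC_WS_URL ?? "ws://localhost:7777/sync",
  syncUrl: env.SYNC_URL ?? "http://localhost:7777",
  workerPort: Number(env.WORKER_PORT ?? 5555),
  nodeEnv: env.NODE_ENV ?? "development",
});

// In a real worker: const config = resolveConfig(process.env);
const config = resolveConfig({ WORKER_PORT: "6001" });
// config.workerPort === 6001; the other fields fall back to their defaults
```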
Worker Architecture Patterns
HTTP Server Setup
Workers create HTTP servers with:
- CORS support for cross-origin requests
- Health check endpoints (`/health`)
- Main processing endpoints (`/tonk`)
- Custom business logic endpoints
- Error handling and logging
- Graceful shutdown handling
KeepSync Integration
Workers configure KeepSync engines with:
- WebSocket network adapters for real-time sync
- File system storage adapters
- Document read/write operations
- Schema validation
- Connection management
Data Flow Patterns
- HTTP requests received at worker endpoints
- Data validation and processing
- KeepSync document operations (read/write)
- Response formatting and delivery
- Error handling and logging
Utility Modules
BaseCredentialsManager ⭐ CRITICAL FOR EXTERNAL API INTEGRATION
ALWAYS use BaseCredentialsManager for any external service authentication (Google APIs, OpenAI, etc.)
Handles secure credential storage and management:
- Interactive credential setup via CLI prompts
- File-based credential storage in the `creds/` directory
- Validation functions for credential format
- Multiple credential types (API keys, OAuth tokens, certificates, JSON service accounts)
- Automatic headers and URL parameter injection
- Supports both simple API keys and complex JSON credentials (like Google service accounts)
Key Usage Pattern:
// 1. Define credential configurations
const credentialsManager = new BaseCredentialsManager([
{
name: 'Google Service Account',
filename: 'google_service_account.json',
description: 'Google Cloud service account for Gmail API access',
instructions: 'Download from Google Cloud Console > IAM & Admin > Service Accounts',
validationFn: content => {
try {
const parsed = JSON.parse(content);
return { valid: !!parsed.client_email, message: 'Must be valid service account JSON' };
} catch {
return { valid: false, message: 'Must be valid JSON' };
}
},
},
{
name: 'OpenAI API Key',
filename: 'openai_api_key.txt',
description: 'OpenAI API key for topic analysis',
instructions: 'Get from https://platform.openai.com/api-keys',
header: 'Authorization', // Will add as "Authorization: Bearer <key>"
},
]);
// 2. Initialize and check credentials
await credentialsManager.init();
const { complete, missing } = await credentialsManager.checkCredentials();
if (!complete) await credentialsManager.setupCredentials();
// 3. Load credentials in your code
const googleCreds = await credentialsManager.getCredentialByName('Google Service Account');
const openaiKey = await credentialsManager.getCredentialByName('OpenAI API Key');
I/O System
Modular I/O handling with:
- IOHandler Interface: Generic read operations
- IOManager: Multi-scheme URL handling (fs:, http:, keepsync:)
- IOResult: Standardized result wrapper
- DataMapper: Schema transformation
- IOResultMapper: KeepSync document integration
Event Listeners
- File Listeners: File system change monitoring
- KeepSync Listeners: Document change notifications
- Custom Listeners: Business logic event handling
CLI Interface
Workers provide command-line interfaces:
- `start`: Launch the worker service
- Port configuration via `--port` flag
- Environment variable override support
- Error handling and process management
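The template implements this CLI with commander; the dependency-free sketch below shows the same surface — a `start` command, a `--port` flag, and an environment-variable override — with illustrative names and a plausible precedence (flag over `WORKER_PORT` over the default):

```typescript
// Parse the worker CLI arguments: `start --port <n>`
const parseCli = (
  argv: string[],
  env: Record<string, string | undefined>
): { command: string; port: number } => {
  const [command = "start"] = argv;
  const flagIndex = argv.indexOf("--port");
  const flagPort = flagIndex >= 0 ? argv[flagIndex + 1] : undefined;
  // Precedence: explicit flag > WORKER_PORT env var > default 5555
  const port = Number(flagPort ?? env.WORKER_PORT ?? 5555);
  return { command, port };
};

parseCli(["start", "--port", "6001"], {});    // { command: "start", port: 6001 }
parseCli(["start"], { WORKER_PORT: "7002" }); // { command: "start", port: 7002 }
```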
Development Workflow
Local Development
⚠️ CRITICAL: Always run worker commands from within the specific worker directory, NOT from the workspace root!
# CORRECT - Run from within the worker directory:
cd workers/your-worker-name/
pnpm install # Install dependencies
pnpm dev # Start with hot reload
pnpm build # Compile TypeScript
pnpm start # Run production build
# INCORRECT - Do NOT run from workspace root:
# npm run dev # ❌ This will fail - no dev script in workspace root
Always navigate to the specific worker directory first.
Production Deployment
- `pnpm build` - Compile for production
- `tonk worker register` - Register with Tonk platform
- Process manager handles lifecycle
- Health checks monitor status
Best Practices
Error Handling
- Global uncaught exception handlers
- Unhandled promise rejection logging
- Graceful HTTP error responses
- Structured error messages
Security
- Credential isolation in separate directory
- Environment variable configuration
- CORS policy management
- Input validation and sanitization
Performance
- Configurable process instances
- Memory limit enforcement
- Auto-restart capabilities
- Connection pooling for external services
Data Management
- Schema-based validation
- Atomic document operations
- Conflict-free data structures (CRDTs)
- Offline-first design principles
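A sketch of the kind of schema-based validation a worker can run before writing a document to keepsync. The schema shape here is illustrative — the actual `worker.config.js` schema format may differ:

```typescript
type FieldType = "string" | "number" | "boolean";
type Schema = Record<string, FieldType>;

// Return a list of problems; an empty list means the doc is safe to write
const validateDoc = (doc: Record<string, unknown>, schema: Schema): string[] => {
  const errors: string[] = [];
  for (const [field, expected] of Object.entries(schema)) {
    if (!(field in doc)) {
      errors.push(`missing field: ${field}`);
    } else if (typeof doc[field] !== expected) {
      errors.push(`field ${field}: expected ${expected}, got ${typeof doc[field]}`);
    }
  }
  return errors;
};

const schema: Schema = { id: "number", title: "string" };
validateDoc({ id: 1, title: "ok" }, schema); // [] - valid, safe to write
validateDoc({ id: "1" }, schema);            // wrong type for id, missing title
```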
Integration Points
With Tonk Platform
- Worker registration and discovery
- Health monitoring and reporting
- Configuration management
- Deployment coordination
With KeepSync
- Document synchronization
- Real-time collaboration
- Offline capability
- Conflict resolution
With External Services
- API integration via credentials
- Webhook handling
- File system operations
- Database connections
Example Usage Patterns
Basic HTTP Endpoint
// Handle POST requests with JSON processing
if (req.method === 'POST' && req.url === '/tonk') {
// Parse JSON body
// Process business logic
// Update KeepSync documents
// Return structured response
}
KeepSync Document Operations
// Configure sync engine
const engine = await configureSyncEngine({
url: SYNC_URL,
network: [wsAdapter],
storage: new NodeFSStorageAdapter(),
});
// Read/write documents
const data = await readDoc(documentPath);
await writeDoc(documentPath, updatedData);
Credential Management
const credentialsManager = new BaseCredentialsManager([
{
name: 'API Key',
filename: 'api_key.txt',
description: 'External service API key',
instructions: 'Obtain from service provider dashboard',
},
]);
This architecture enables workers to serve as integration points between external services and the Tonk ecosystem, providing real-time data synchronization and collaborative capabilities.
DO NOT EDIT - AUTO-GENERATED FROM docs/src/llms/templates/workspace/instructions.md
This file is automatically generated from the documentation.
Edit the source file instead: docs/src/llms/templates/workspace/instructions.md
Generated on: 2025-07-17T13:11:10.060Z
Tonk Workspace Agent Instructions
Overview
You are an AI agent acting as the operating system for a Tonk workspace. A Tonk workspace is a development environment that consists of:
- Console: A React debugging application with file browser and worker monitoring (run with `pnpm dev` in the console folder)
- Views: React applications that provide user interfaces and come bundled with the keepsync library for access
- Workers: Background processes that fetch and store data into keepsync stores (distributed CRDT-based databases addressed using document paths)
- Instructions: Documentation and guidance files
Your role is to help users accomplish their goals by managing these components through Tonk commands. This workspace is primarily focused on development workflows.
What You Can Help With
When users ask what this workspace is for or what you can help them with, explain that you can assist with:
- Creating new views and workers for their applications
- Running and managing views (React-like applications)
- Starting, stopping, and monitoring background workers
- Publishing and sharing applications
- Debugging through the console interface
- Managing the entire development lifecycle
Core Tonk Commands and When to Use Them
1. tonk hello
- This initializes the Tonk daemon which provides synchronization services.
- Must be executed before any other Tonk operations
- If the user is experiencing issues with syncing, it might be because the Tonk daemon is not running
- Starts the local Tonk daemon
2. tonk-create
Use this when users need functionality or data that doesn't already exist in the workspace.
- Ask the user for the name and description of the worker they are creating
- Choose this when users request data sources (`tonk-create -t worker -n <name> -d <description>`) or UI components (`tonk-create -t react -n <name> -d <description>`)
3. tonk push
Use to prepare a view bundle for sharing (primarily for use with `tonk proxy`).
- Use when you want to share a view temporarily with others
- Packages and uploads application bundles to the Tonk server
- Typically followed by `tonk start` and `tonk proxy` for sharing workflows
4. tonk ls
Shows what bundles are currently pushed to the server.
- Use to check available applications
- Helpful for debugging deployment issues
5. tonk ps
Shows what bundles are currently running.
- Use to check which applications are active
- Essential for debugging and monitoring
6. tonk start <bundle-name>
Starts a bundle so it can be served.
- Use after pushing a bundle to make it accessible
- Required to run applications
7. tonk kill <server-id>
Stops a running bundle.
- Use to shut down applications
- Helpful for resource management and debugging
8. tonk proxy <bundle-name>
Creates a temporary reverse proxy for sharing (60-minute limit).
- Use for mobile debugging
- Use for sharing developer previews
- Use for temporary live demonstrations
- Automatically expires after 60 minutes
Worker Management
Worker Lifecycle Options
After creating a worker with `tonk-create`, you have three approaches:
Option A: Manual Development
- Run the worker manually using `pnpm dev` in the project directory
- Good for active development and debugging
Option B: Background Service
- Register the worker: `tonk worker register [dir]`
- Start the worker: `tonk worker start <nameOrId>`
- Use this when you want workers running continuously in the background
Option C: On-Demand Endpoint
- Go through the registration flow as in Option B
- Create workers that listen on specific endpoints
- Workers only run a job when pinged
- Have the main project hit the endpoint when needed
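The on-demand pattern in Option C boils down to a single job handler that runs only when the endpoint is pinged. The sketch below is illustrative only: `handlePing`, `fetchFromSource`, and the result shape are hypothetical stand-ins, not the worker template's actual API, and a real worker would write the fetched data into a keepsync store rather than just returning it.

```typescript
// Hypothetical on-demand job handler. Names and types are illustrative; a
// real Tonk worker would expose this behind its HTTP endpoint and persist
// the fetched data into a keepsync store.
type JobResult = { success: boolean; items: number };

async function fetchFromSource(): Promise<string[]> {
  // Stand-in for a real external API call.
  return ["a", "b", "c"];
}

async function handlePing(): Promise<JobResult> {
  const data = await fetchFromSource();
  // In a real worker: await writeDoc("my-collection/data", data);
  return { success: true, items: data.length };
}

handlePing().then((result) => {
  console.log(result.success, result.items); // prints: true 3
});
```

Because the job only runs inside `handlePing`, the worker stays idle until the main project hits the endpoint, which keeps background resource usage near zero.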
Worker Commands
- `tonk worker ls` - List all registered workers
- `tonk worker inspect <nameOrId>` - View worker details and status
- `tonk worker start <nameOrId>` - Start a registered worker
- `tonk worker stop <nameOrId>` - Stop a running worker
- `tonk worker logs <nameOrId>` - View worker logs
- `tonk worker ping <nameOrId>` - Check worker status
- `tonk worker rm <nameOrId>` - Remove a registered worker
- `tonk worker install <package>` - Install a worker from npm
- `tonk worker init` - Initialize a new worker configuration
Decision Framework
When a user requests something, ask yourself:
- Do they need new functionality? → Use `tonk-create`
- Do they want to share something temporarily? → Use `tonk push`, then `tonk start`, then `tonk proxy`
- Do they need to manage running services? → Use typical React commands for local development; for shareable bundles, use `tonk ps`, `tonk start`, `tonk kill`
- Do they need background data processing? → Create and register workers
- Do they need more information about system state? → Run the console with `pnpm dev`, use `tonk worker logs`, `tonk ps`
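The decision framework above can be sketched as a simple lookup from user intent to the commands to reach for. The intent labels here are invented for illustration; the command strings are the ones documented above.

```typescript
// Illustrative mapping from a user's request to the relevant Tonk commands.
// The Intent labels are made up for this sketch; the commands match the
// decision framework above.
type Intent =
  | "new-functionality"
  | "share-temporarily"
  | "manage-services"
  | "background-data"
  | "inspect-state";

const commandsFor: Record<Intent, string[]> = {
  "new-functionality": ["tonk-create"],
  "share-temporarily": ["tonk push", "tonk start <bundle-name>", "tonk proxy <bundle-name>"],
  "manage-services": ["tonk ps", "tonk start <bundle-name>", "tonk kill <server-id>"],
  "background-data": ["tonk-create", "tonk worker register [dir]", "tonk worker start <nameOrId>"],
  "inspect-state": ["tonk ps", "tonk worker logs <nameOrId>"],
};

console.log(commandsFor["share-temporarily"].join(" → "));
// prints: tonk push → tonk start <bundle-name> → tonk proxy <bundle-name>
```

Treating the framework as a table like this makes it easy to check that every kind of request resolves to a concrete next command.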
Important Notes
- The console is a React app - run it with `pnpm dev` in the console folder for debugging
- Views are for user interfaces; workers are for data processing
- This workspace is primarily for development - focus on development workflows
- Workers store data in keepsync stores (CRDT-based distributed databases)
- Detailed worker data handling instructions are provided in worker templates when created
- Workers can be managed manually or as background services depending on needs
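To make "addressed using document paths" concrete, here is a toy in-memory stand-in for a path-addressed store. This is not the keepsync API: it only mimics the `writeDoc`/`readDoc` shape shown in the worker templates, with none of the CRDT syncing a real keepsync store provides.

```typescript
// Toy path-addressed document store. NOT keepsync: a real keepsync store is
// a distributed CRDT database that syncs documents between peers; this Map
// only illustrates how documents are looked up by path.
const docs = new Map<string, unknown>();

async function writeDoc(path: string, data: unknown): Promise<void> {
  docs.set(path, data);
}

async function readDoc<T>(path: string): Promise<T | undefined> {
  return docs.get(path) as T | undefined;
}

async function main() {
  await writeDoc("my-collection/data", { items: [1, 2, 3] });
  const doc = await readDoc<{ items: number[] }>("my-collection/data");
  console.log(doc?.items.length); // prints: 3
}

main();
```

The path (e.g. `my-collection/data`) is the stable address both workers and views agree on, which is what lets a worker write data that a view later reads.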
Shareability
When users want to share their work temporarily:
- Push the bundle: `tonk push` to package and upload
- Start the bundle: `tonk start <bundle-name>` to make it available
- Create a proxy: `tonk proxy <bundle-name>` for 60-minute temporary sharing
This workflow is ideal for:
- Mobile debugging and testing
- Sharing developer previews
- Live demonstrations
- Getting quick feedback from others
Common Workflows
New Feature Development:
1. `tonk-create` → Create a view/worker as needed
2. Develop and test locally
3. Optionally use `tonk proxy` for mobile testing
Temporary Sharing:
1. `tonk push` → Upload the bundle
2. `tonk start <bundle-name>` → Start the service
3. `tonk proxy <bundle-name>` → Create a 60-minute shareable link
Background Data Processing:
1. `tonk-create` → Create the worker
2. `tonk worker register` → Register for background running
3. `tonk worker start <nameOrId>` → Start the service
Your goal is to be the intelligent interface between the user and the Tonk ecosystem, helping them navigate these tools efficiently to accomplish their objectives.