
TypeScript Runtime

In addition to the Python CLI, Cognitive Modules provides a standalone TypeScript runtime, cognitive-runtime.

Installation

npm install -g cognitive-runtime

CLI Usage

# Run module
cog run code-reviewer --args "your code" --pretty

# List modules
cog list

# Module info
cog info code-reviewer

# Validate module
cog validate code-reviewer --v22

Programmatic Usage

Basic

import { runModule } from 'cognitive-runtime';

const result = await runModule('code-reviewer', {
  code: 'function add(a, b) { return a + b; }',
  language: 'javascript'
});

console.log(result.meta.confidence);
console.log(result.data.issues);
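
Since the result uses the ModuleResult envelope described under v2.2 Format Support below, it is worth checking ok before reading data. A minimal sketch continuing from the call above; treating issues as an iterable array is an assumption based on this example:

// On failure the envelope carries error instead of data.
if (!result.ok) {
  console.error(`${result.error?.code}: ${result.error?.message}`);
} else if (result.meta.confidence < 0.5) {
  // Low-confidence results can be treated as advisory only.
  console.warn(`Low confidence: ${result.meta.explain}`);
} else {
  for (const issue of result.data?.issues ?? []) {
    console.log(issue);
  }
}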

With Configuration

import { CognitiveRuntime } from 'cognitive-runtime';

const runtime = new CognitiveRuntime({
  modulesPath: './cognitive/modules',
  provider: 'openai',
  apiKey: process.env.OPENAI_API_KEY,
  model: 'gpt-4o'
});

const result = await runtime.run('code-reviewer', { code: '...' });
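
The same constructor options can target a different provider; the environment variables table below lists openai, anthropic, and ollama. A sketch assuming an Anthropic setup, with the model taken from LLM_MODEL rather than hard-coded:

import { CognitiveRuntime } from 'cognitive-runtime';

// Same options as above, pointed at Anthropic instead of OpenAI.
const runtime = new CognitiveRuntime({
  modulesPath: './cognitive/modules',
  provider: 'anthropic',
  apiKey: process.env.ANTHROPIC_API_KEY,
  model: process.env.LLM_MODEL
});

const result = await runtime.run('code-reviewer', { code: '...' });
console.log(result.meta.explain);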

Streaming

import { runModuleStream } from 'cognitive-runtime';

const stream = runModuleStream('code-reviewer', { code: '...' });

for await (const chunk of stream) {
  process.stdout.write(chunk);
}
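
If you also need the complete output after streaming finishes, you can accumulate the chunks as they arrive; whether the accumulated text parses back into a ModuleResult payload depends on the module, so that step is not shown:

import { runModuleStream } from 'cognitive-runtime';

let full = '';
for await (const chunk of runModuleStream('code-reviewer', { code: '...' })) {
  process.stdout.write(chunk); // live output
  full += chunk;               // keep a copy of everything streamed
}

console.log(`\nReceived ${full.length} characters`);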

v2.2 Format Support

The TypeScript runtime fully supports the v2.2 result format:

interface ModuleResult<T = any> {
  ok: boolean;
  meta: {
    confidence: number;
    risk: 'none' | 'low' | 'medium' | 'high';
    explain: string;
  };
  data?: T & {
    rationale: string;
    extensions?: {
      insights?: Array<{
        text: string;
        suggested_mapping: string;
      }>;
    };
  };
  error?: {
    code: string;
    message: string;
  };
}
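
Because ok discriminates between the data and error branches, the envelope pairs well with a module-specific payload type and a small unwrap helper. A sketch; it assumes the ModuleResult type is exported by the package, and the CodeReviewData shape is purely illustrative:

import { runModule } from 'cognitive-runtime';
import type { ModuleResult } from 'cognitive-runtime';

// Illustrative payload shape for code-reviewer; adjust to the module's actual output.
interface CodeReviewData {
  issues: Array<{ severity: string; message: string }>;
}

// Throw on the error branch, return the typed data branch otherwise.
function unwrap<T>(result: ModuleResult<T>): T & { rationale: string } {
  if (!result.ok || !result.data) {
    throw new Error(`${result.error?.code ?? 'UNKNOWN'}: ${result.error?.message ?? 'module failed'}`);
  }
  return result.data;
}

const result = (await runModule('code-reviewer', { code: '...' })) as ModuleResult<CodeReviewData>;
const review = unwrap(result);

console.log(review.rationale);
review.issues.forEach((issue) => console.log(`${issue.severity}: ${issue.message}`));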

HTTP Server

# Start HTTP server
cog serve --port 8000

# With CORS
cog serve --port 8000 --cors

API Endpoints

# Run module
POST /api/run/:module
Content-Type: application/json
{
  "code": "...",
  "language": "python"
}

# List modules
GET /api/modules

# Module info
GET /api/modules/:module
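
The run endpoint can be called from any HTTP client. A sketch using the built-in fetch against a server started with cog serve --port 8000; it assumes the response body is the same ModuleResult envelope returned by the programmatic API:

// Assumes `cog serve --port 8000` is running locally.
const response = await fetch('http://localhost:8000/api/run/code-reviewer', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    code: 'def add(a, b): return a + b',
    language: 'python'
  })
});

const result = await response.json();
console.log(result.meta?.confidence, result.data?.issues);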

MCP Server

# Start MCP server (for Claude Desktop, Cursor)
cog mcp

# With custom port
cog mcp --port 3000
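
To register the server with a client such as Claude Desktop, add an entry to its MCP configuration; the server name here is arbitrary, and the snippet assumes cog is on your PATH:

{
  "mcpServers": {
    "cognitive-modules": {
      "command": "cog",
      "args": ["mcp"]
    }
  }
}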

Environment Variables

Variable                 Description
LLM_PROVIDER             Provider (openai/anthropic/ollama)
OPENAI_API_KEY           OpenAI API key
ANTHROPIC_API_KEY        Anthropic API key
LLM_MODEL                Default model
COGNITIVE_MODULES_PATH   Custom modules path

Comparison with Python

Feature          Python (cogn)    TypeScript (cog)
CLI
HTTP Server
MCP Server
Async
Streaming
v2.2 Support