Worksona Chat: An Agent Orchestration Platform for Complex Work
#worksona #portfolio #ai-agents #mcp #multi-model #chat-sdk
David Olsson

Most AI chat products are a single conversational model wrapped in a UI. Worksona Chat is something structurally different: an agent orchestration platform where 13+ specialized agents collaborate on multi-step tasks, each contributing domain expertise to a shared workspace.
What It Is
Worksona Chat provides a dual-pane interface: conversation on the left, live artifacts on the right. Users select domain-specific agents from three categories (boss, business, and software), then attach documents, write code, generate visualizations, and work with data — all without leaving the session.
The platform supports Anthropic Claude, OpenAI GPT, Google Gemini, and xAI Grok through a unified provider abstraction. Switching models mid-conversation is supported. Chat history, uploaded documents, and generated artifacts persist in Neon Postgres across sessions.
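The unified provider abstraction can be sketched as a common completion interface plus a registry, so switching models mid-conversation is a lookup rather than a code path change. The names below are illustrative assumptions, not Worksona's actual schema; the stub provider stands in for a real SDK call.

```typescript
// Hypothetical sketch of a unified provider abstraction.
interface ChatMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

interface Provider {
  id: string;
  complete(messages: ChatMessage[], model: string): Promise<string>;
}

// A registry keyed by provider id; switching providers is just a lookup.
class ProviderRegistry {
  private providers = new Map<string, Provider>();

  register(p: Provider): void {
    this.providers.set(p.id, p);
  }

  resolve(id: string): Provider {
    const p = this.providers.get(id);
    if (!p) throw new Error(`Unknown provider: ${id}`);
    return p;
  }
}

// Stub provider standing in for a real Anthropic SDK call.
const anthropic: Provider = {
  id: "anthropic",
  async complete(messages, model) {
    return `[${model}] echo: ${messages[messages.length - 1].content}`;
  },
};

const registry = new ProviderRegistry();
registry.register(anthropic);
```

Because every provider satisfies the same interface, the UI layer never branches on which vendor is active.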
Alongside the AI orchestration layer, the webchat component adds real-time peer-to-peer communication via WebRTC — with video and audio routed directly between participants, not through a central media server. Speech-to-text transcription surfaces as regular chat messages in the shared timeline, keeping spoken and written communication in one unified interface.
Why It Matters
Traditional AI assistants hand back a single response to a single question. Complex knowledge work — analyzing a dataset, producing a strategy document, building a prototype — requires sequential reasoning across multiple steps with different kinds of expertise at each stage.
Worksona Chat addresses this by separating what agents are (JSON definitions in agentsX/) from how they think (Markdown protocol documents in protocols/). An agent's identity stays fixed; its cognitive approach is loadable at runtime. A business strategy agent can reason through a critical-thinking protocol without becoming a different agent.
For organizations running document-heavy workflows, the practical effect is that PDFs, Word documents, Excel sheets, and images can all be uploaded and interrogated in the same session where code gets written and executed.
The webchat component addresses a different but related problem: organizational dependency on third-party conferencing platforms. Because WebRTC routes media peer-to-peer and the server handles signaling only, video and audio content never passes through the server. Organizations that need to control their communication infrastructure can self-host the signaling layer.
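The signaling-only role can be made concrete with a minimal relay sketch: the server forwards SDP offers/answers and ICE candidates between peers and treats the payload as opaque, so media never touches it. Message shapes here are illustrative assumptions, not the actual webchat protocol.

```typescript
// Hypothetical signaling relay: forwards messages, never inspects media.
interface SignalMessage {
  from: string;
  to: string;
  kind: "offer" | "answer" | "ice";
  payload: string; // SDP or ICE candidate JSON, opaque to the server
}

type Send = (msg: SignalMessage) => void;

class SignalingRelay {
  private peers = new Map<string, Send>();

  join(peerId: string, send: Send): void {
    this.peers.set(peerId, send);
  }

  // Forward a message to its target peer, if connected.
  relay(msg: SignalMessage): boolean {
    const send = this.peers.get(msg.to);
    if (!send) return false;
    send(msg);
    return true;
  }
}

// Two peers exchange an offer through the relay.
const relay = new SignalingRelay();
const inbox: SignalMessage[] = [];
relay.join("alice", () => {});
relay.join("bob", (m) => inbox.push(m));
relay.relay({ from: "alice", to: "bob", kind: "offer", payload: "sdp..." });
```

Once the offer/answer exchange completes, the browsers' RTCPeerConnection carries audio and video directly; self-hosting this relay is all the infrastructure control requires.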
How It Works
```mermaid
flowchart TD
    User["User (browser)"]
    UI["Next.js UI\n(conversation + artifact panel)"]
    Agents["Agent System\n(13+ JSON definitions)"]
    Protocols["Protocols\n(Markdown cognitive frameworks)"]
    Skills["Agent Skills\n(16+ agentskills.io modules)"]
    MCP["MCP Layer\n(external tools via Model Context Protocol)"]
    Providers["AI Providers\nAnthropic · OpenAI · Google · xAI"]
    DB["Neon Postgres\n(history, documents, artifacts)"]
    WebRTC["WebRTC Signaling\n(peer-to-peer media)"]
    User --> UI
    UI --> Agents
    Agents --> Protocols
    Agents --> Skills
    Agents --> MCP
    Agents --> Providers
    UI --> DB
    User --> WebRTC
    WebRTC -->|"signaling only\nmedia is peer-to-peer"| WebRTC
```
Agents as data. No code changes are required to add, modify, or remove an agent. The roster lives in JSON files, making it reviewable through ordinary data diffs.
Protocols as runtime cognitive stacks. Protocol files compile to a metadata index at build time so runtime lookup stays fast as the protocol library grows.
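A build-time index of this kind can be sketched as a single pass over the protocol sources that keeps only lightweight metadata, so runtime lookup is a constant-time map access regardless of library size. The assumption that each protocol's first `#` heading is its title is illustrative; the real build step and schema are not described in this article.

```typescript
// Hypothetical build-time compilation of protocol Markdown into an index.
interface ProtocolMeta {
  slug: string;
  title: string;
}

// Build-time: scan protocol sources once, keep only metadata.
function buildIndex(files: Record<string, string>): Map<string, ProtocolMeta> {
  const index = new Map<string, ProtocolMeta>();
  for (const [slug, markdown] of Object.entries(files)) {
    const heading = markdown.split("\n").find((l) => l.startsWith("# "));
    index.set(slug, { slug, title: heading ? heading.slice(2).trim() : slug });
  }
  return index;
}

// Runtime: lookups never re-read the Markdown bodies.
const index = buildIndex({
  "critical-thinking": "# Critical Thinking\nSteps...",
  "root-cause": "# Root Cause Analysis\nSteps...",
});
```

Only when a protocol is actually selected does its full Markdown body need to be loaded.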
agentskills.io skills. The 16+ modular skills follow a public open specification, making them portable across any agent system that implements it.
MCP as integration substrate. External tools connect via Model Context Protocol through dedicated API routes. Any MCP-compatible server integrates without modifying core tool-handling logic.
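What makes "any MCP-compatible server integrates" possible is that MCP is JSON-RPC 2.0: a tool invocation is a plain request object the API route can forward to whichever server is configured. The request shape below follows the Model Context Protocol specification's `tools/call` method; the dispatch wiring around it is an illustrative assumption, not Worksona's actual route code.

```typescript
// Build a `tools/call` request per the Model Context Protocol spec
// (JSON-RPC 2.0 envelope with name + arguments params).
interface McpRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

let nextId = 0;

function toolCall(name: string, args: Record<string, unknown>): McpRequest {
  return {
    jsonrpc: "2.0",
    id: ++nextId,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const req = toolCall("search_docs", { query: "quarterly revenue" });
```

Because the envelope is uniform, the core tool-handling logic never needs per-server code; only the transport endpoint changes.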
Resumable streaming. AI response streams survive network interruptions and can be resumed, using the resumable-stream package alongside Next.js after() for stream lifecycle management.
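The resume mechanism can be illustrated generically (this is not the resumable-stream package's actual API): the server buffers emitted chunks per stream, and a reconnecting client asks for everything after the last offset it received.

```typescript
// Generic sketch of offset-based stream resumption.
class ResumableBuffer {
  private chunks: string[] = [];

  // Record a chunk; returns the offset after it, for the client to remember.
  append(chunk: string): number {
    this.chunks.push(chunk);
    return this.chunks.length;
  }

  // On reconnect, replay only what the client missed.
  readFrom(offset: number): string[] {
    return this.chunks.slice(offset);
  }
}

const stream = new ResumableBuffer();
stream.append("The quick ");
const seen = stream.append("brown fox ");
stream.append("jumps.");
// The client dropped after `seen` chunks; it resumes with the remainder.
const missed = stream.readFrom(seen);
```

In the real platform this buffering has to outlive the original request, which is where `after()` (deferring work past the response) comes in.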
Where It Fits in Worksona
Worksona Chat is the foundational platform in the Worksona portfolio. The patterns it establishes — agent-as-data definitions, protocol-driven reasoning, agentskills.io skills, and MCP as integration substrate — are the architectural inheritance available to every other project in the portfolio.
For organizations, the deployment path is Vercel (one-click) or any Node.js 18.18+ server. The entire capability set — multi-agent orchestration, multi-model support, document processing, code execution, artifact rendering, persistent history, and user entitlements — deploys as a single application.
The goal is not to replace existing tools. It is to reduce the number of tools a knowledge worker needs to switch between when the work is complex.
Live: chat.worksona.io