Laminar - open-source all-in-one platform for engineering AI products. Create a data flywheel for your AI app. Traces, Evals, Datasets, Labels. YC S24.
FlowLLM: Simplifying LLM-based HTTP/MCP Service Development
Supercharge qwen-code with hybrid prompt-chaining, powered by gemini-cli integration. Performance improvements include 72% faster execution, 36-83% token-efficiency gains, and a 91.7% success rate verified across 5 repository benchmarks.
Document-first CLI for building agent workflows in Markdown and JSON.
ISON (Interchange Simple Object Notation) is a completely language-independent text format that represents data in a way that maximizes token efficiency and minimizes cognitive load for AI systems. These properties make ISON an ideal data-interchange format for agentic AI and LLM workflows.
LLM Framework for LLMs
Prompt-based E2E workflow system for Claude Code
Force any OpenAI-compatible tool (Aider, Fabric, Interpreter) to use Gemini, Groq, Cerebras, or Ollama. Pure Bash. Zero dependencies.
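Tools in this category generally work by pointing an OpenAI-compatible client at an alternative base URL. The TypeScript sketch below illustrates that underlying mechanism using the official openai package and a local Ollama endpoint; it is not this repo's Bash implementation, and the base URL (http://localhost:11434/v1) and model name ("llama3") are assumptions for illustration.

```typescript
// Minimal sketch of the OpenAI-compatible endpoint mechanism.
// Assumes a local Ollama server exposing its OpenAI-compatible API at
// http://localhost:11434/v1 and a pulled model named "llama3".
import OpenAI from "openai";

const client = new OpenAI({
  // Redirect the client away from api.openai.com to any compatible provider.
  baseURL: process.env.OPENAI_BASE_URL ?? "http://localhost:11434/v1",
  // Ollama ignores the key, but the SDK requires a non-empty value.
  apiKey: process.env.OPENAI_API_KEY ?? "ollama",
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "llama3",
    messages: [
      { role: "user", content: "Summarize what an LLM workflow is in one sentence." },
    ],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```

Clients like Aider or Fabric typically expose the same kind of override through environment variables or configuration, which is what a pure-Bash wrapper can set with zero dependencies.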
CLI tool for LLM prompt pipelines. Reusable. Shareable. Scriptable.
Calibrates LLM responses for high-agency power users, prioritizing rigorous analysis and executive control, and overrides the default RLHF-driven tendency toward immediate, ungrounded solutions that serve lower-agency "LLM as magic tool" workflows.
Example of GPT-driven, role-based prompt orchestration: defining custom roles, splitting generation across them, and synthesizing their output in a review workflow.
An open-source Node.js RESTful backend API server designed to manage and execute complex workflows with AI and human-in-the-loop capabilities.
Project structure and workflow for AI-assisted software development.
Sovereign Context Protocol - A human-first workflow for managing LLM memory. Not an MCP server.
AI guardrails and personas that can help improve code production and ensure consistency.