title: FluidTools
sdk: gradio
emoji: π
colorFrom: blue
colorTo: purple
thumbnail: >-
https://cdn-uploads.huggingface.co/production/uploads/67f25e195870dd4f0f70a59f/SXTyGczeqFZgnhS4X-fwn.jpeg
short_description: Self-Driving Backends that Think, Talk, and Execute
sdk_version: 5.46.0
pinned: true
FluidTools:
AI-powered API multi-tool agent with multi-provider support (OpenAI, Anthropic, Ollama, Gemini, Nebius)
Hackathon Details
- Track: building-mcp-track-enterprise
- Link to Demo: FluidTools Demo
- Contributors
Overview
FluidTools is a powerful NPM package that transforms REST API collections (Postman 2.1 JSON format) into intelligent AI agent tools. Built with TypeScript, it integrates seamlessly into any Node.js/TypeScript server, enabling you to quickly add AI agents that can interact with your APIs using natural language queries.
Key Features
- One-Click Tool Generation: Convert Postman collections to LangChain-compatible tools instantly
- Multi-Provider AI Support: Compatible with OpenAI, Anthropic, Ollama, Gemini, and Nebius
- LangGraph Integration: Robust agent orchestration with state management and memory
- Semantic Tool Selection: Optional embedding-based tool filtering for large APIs
- Human-in-the-Loop Security: Exact tool selection and user approval for sensitive operations
- Multi-Language Support: Babel integration for international chatbot deployment
- Server Agnostic: Integrates with any Express/Fastify/Koa server
- TypeScript First: Full type safety with Zod schemas
Installation
npm install fluidtools
Quick Start
1. Convert Postman Collection to Tools
npx fluidtools ./api.json ./tools.ts
Or programmatically:
import fs from "fs";
import { postmanToLangChainCode } from "fluidtools";

const collection = JSON.parse(fs.readFileSync("./api.json", "utf-8"));
const code = postmanToLangChainCode(collection);
fs.writeFileSync("./tools.ts", code);
2. Create AI Agent Server
import express from "express";
import { FluidToolsClient, loadProviderConfigFromEnv } from "fluidtools";
import { generateTools } from "./tools.ts"; // Generated tools
const app = express();
app.use(express.json());
const providerConfig = loadProviderConfigFromEnv();
const fluidClient = new FluidToolsClient(
providerConfig,
generateTools,
"You are a helpful API assistant.",
10, // max tool calls
true // debug mode
);
app.get("/", async (req, res) => {
const { query } = req.query;
const { authorization } = req.headers;
const token = authorization?.split(" ")[1];
const response = await fluidClient.query(query, token);
res.send({ message: response });
});
app.listen(8000);
3. Query Your AI Agent
curl -G "http://localhost:8000/" \
  --data-urlencode "query=Get user details and list their projects" \
  -H "Authorization: Bearer YOUR_TOKEN"
Architecture
System Architecture Diagram
graph TD
A[Postman 2.1 JSON] --> B[CLI Tool<br/>fluidtools]
B --> C[Tool Generation<br/>TypeScript + Zod Schemas]
C --> D[FluidTools Client]
D --> E[Optional Embedding Service<br/>Semantic Tool Selection]
D --> F[System Prompt<br/>Custom Chatbots]
F --> G[LangGraph Agent<br/>Orchestration & Memory]
G --> H[Multi-Provider LLM Support]
H --> I[Multiple Model Support]
I --> J[Multi-Language Support<br/>Babel Integration]
J --> K[Server Integration<br/>Express/Fastify/Koa]
K --> L[API Exposed<br/>REST/WebSocket]
subgraph "π§ Tool Conversion Pipeline"
A
B
C
end
subgraph "π€ AI Agent Core"
D
F
G
H
I
J
end
subgraph "π Integration Layer"
K
L
end
subgraph "β‘ Security & Control"
M[Human-in-Loop<br/>Tool Confirmation]
N[Exact Tool Selection<br/>Security Controls]
end
G --> M
M --> N
subgraph "Provider Ecosystem"
O[OpenAI<br/>GPT-4, GPT-3.5]
P[Anthropic<br/>Claude 3.5, Opus]
Q[Ollama<br/>Local Models]
R[Gemini<br/>2.5 Flash, Pro]
S[Nebius<br/>Kimi-K2]
end
I --> O
I --> P
I --> Q
I --> R
I --> S
L --> T[Chatbot UI<br/>Gradio/React/Web]
System Architecture Overview
Postman Collection Processing
- Parses Postman 2.1 JSON format
- Extracts requests, parameters, bodies, and schemas
- Generates TypeScript tools with automatic Zod validation
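For reference, the generator consumes standard Postman 2.1 JSON. A minimal collection sketch (field names follow the public Postman 2.1 schema; the endpoint itself is hypothetical):

// A minimal Postman 2.1 collection object; the example API is illustrative
const collection = {
  info: {
    name: "Example API",
    schema: "https://schema.getpostman.com/json/collection/v2.1.0/collection.json",
  },
  item: [
    {
      name: "Get User",
      request: {
        method: "GET",
        header: [{ key: "Authorization", value: "Bearer {{token}}" }],
        url: {
          raw: "https://api.example.com/users/:id",
          protocol: "https",
          host: ["api", "example", "com"],
          path: ["users", ":id"],
        },
      },
    },
  ],
};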
Tool Generation Engine
- Converts each API endpoint into a LangChain tool
- Handles path variables, query parameters, headers
- Supports all HTTP methods (GET, POST, PUT, DELETE, PATCH)
- Auto-generates meaningful descriptions
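The exact generated code depends on your collection; a hypothetical sketch of what one generated tool might look like, assuming LangChain's tool helper and Zod (the tool name, URL, and description are illustrative, not actual FluidTools output):

import { tool } from "@langchain/core/tools";
import { z } from "zod";

// Hypothetical output for a "Get User" request (GET /users/:id) in the collection
export const getUserTool = tool(
  async ({ id }) => {
    // Call the underlying REST endpoint (illustrative URL)
    const res = await fetch(`https://api.example.com/users/${id}`);
    return await res.text();
  },
  {
    name: "get_user",
    description: "Get user details by id (descriptions are derived from the Postman request)",
    schema: z.object({ id: z.string().describe("Path variable :id") }),
  }
);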
Multi-Provider LLM Integration
- Unified interface for different AI providers
- Configurable model selection and API keys
- Consistent response formatting
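A hedged sketch of switching providers, reusing the config shape from the Provider Configuration section below (the Anthropic model name is only an example):

import { FluidToolsClient, loadProviderConfigFromEnv } from "fluidtools";
import { generateTools } from "./tools.ts";

// Option 1: read provider settings from environment variables
const envConfig = loadProviderConfigFromEnv();

// Option 2: choose a provider explicitly (same shape as the Provider Configuration section)
const anthropicConfig = {
  provider: "anthropic",
  model: "claude-3-5-sonnet-latest", // example model name
  apiKey: process.env.ANTHROPIC_API_KEY,
  temperature: 0.1,
};

// Either config is passed to the client the same way
const client = new FluidToolsClient(anthropicConfig, generateTools, "You are a helpful API assistant.");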
LangGraph Orchestration
- Sequential tool execution with memory
- State persistence using checkpointer
- Built-in retry mechanisms and error handling
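FluidTools builds on LangGraph for this. The underlying pattern looks roughly like the following sketch of generic LangGraph usage, not FluidTools internals; MemorySaver, createReactAgent, and the thread_id convention come from @langchain/langgraph, while the generateTools() call shape is assumed:

import { HumanMessage } from "@langchain/core/messages";
import { ChatOpenAI } from "@langchain/openai";
import { MemorySaver } from "@langchain/langgraph";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { generateTools } from "./tools.ts";

// A checkpointer persists agent state so follow-up questions keep context
const checkpointer = new MemorySaver();

const agent = createReactAgent({
  llm: new ChatOpenAI({ model: "gpt-4o-mini" }),
  tools: generateTools(), // call shape of the generated helper is assumed
  checkpointSaver: checkpointer,
});

// The thread_id keys the stored conversation; reuse it to resume the same session
const result = await agent.invoke(
  { messages: [new HumanMessage("List my projects")] },
  { configurable: { thread_id: "session-123" } }
);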
Optional Embedding Layer
- Semantic indexing of tool definitions
- Cosine similarity-based tool selection
- Reduces token usage for large toolsets
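Conceptually, the selection step scores each tool description against the user query and keeps the top matches. A minimal sketch of the scoring math (embed() is a stand-in for whatever embedding model the EmbeddingConfig selects):

// embed() stands in for the configured embedding model
declare function embed(text: string): Promise<number[]>;

// Cosine similarity between two embedding vectors
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Keep only the k tools whose descriptions score closest to the query
async function selectTools(query: string, tools: { name: string; description: string }[], k = 5) {
  const queryVec = await embed(query);
  const scored = await Promise.all(
    tools.map(async (t) => ({ tool: t, score: cosineSimilarity(queryVec, await embed(t.description)) }))
  );
  return scored.sort((x, y) => y.score - x.score).slice(0, k).map((s) => s.tool);
}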
Server Integration
- Session-based conversation management
- Tool call confirmation system
- Rate limiting and authentication
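These concerns live in your own server code. A hedged sketch with Express and the express-rate-limit package (the package choice, limits, and header handling are illustrative, not part of FluidTools):

import express from "express";
import rateLimit from "express-rate-limit";

const app = express();

// Example values: 30 requests per minute per client for free-tier testing
app.use(rateLimit({ windowMs: 60_000, max: 30 }));

// Extract the bearer token that later gets passed to fluidClient.query(query, token)
app.use((req, _res, next) => {
  const token = req.headers.authorization?.split(" ")[1];
  (req as any).accessToken = token; // attached for downstream route handlers
  next();
});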
Data Flow
Postman Collection JSON
        ↓
CLI Tool (fluidtools)
        ↓
TypeScript Tool Code
        ↓
Express/Fastify Server
        ↓
FluidTools Client
        ↓
LangGraph Agent
        ↓
LLM Provider + Tools
        ↓
API Calls + Responses
        ↓
User-Friendly Chat Response
Demo 1: Gradio Integration (Public Testing)
Located in ./demo/server/, this demo provides a complete Express server with Gradio UI integration for testing your AI agents:
Features:
- Web upload interface for Postman collections
- Real-time chat with your AI agent
- Provider selection (OpenAI, Anthropic, etc.)
- Rate limiting for free tier testing
- Tool confirmation dialogs
- Session management
Backend Setup:
cd demo/server
npm install
npm start
Backend runs on http://localhost:3000
Frontend (Gradio UI):
cd demo/gradioServer
pip install -r requirements.txt
python app.py
Frontend runs on http://localhost:7860 - open this in your browser for the beautiful glassmorphic chat interface with drag-and-drop Postman collection upload and real-time AI chat.
Demo 2: Real-World Integration (Cloud API Example)
Located in ./demo2/backend/, this demo shows a production-ready integration with a cloud provider API:
Features:
- Pre-generated tools from Ace Cloud API
- Simplified server setup
- Custom system prompts
- Environment variable configuration
- Tool approval workflows
This demo converts a comprehensive cloud API (instances, volumes, networks, billing, etc.) into AI tools.
Backend Setup:
cd demo2/backend
npm install
npm run dev
Backend runs on http://localhost:8000
Frontend (React App):
cd demo2/frontend
npm install
npm run dev
Frontend runs on http://localhost:5173 - features a modern React chat interface with:
- Voice input/output capabilities (STT/TTS)
- Responsive design with markdown rendering
- Tool approval dialogs for sensitive operations
- Real-time message streaming
- Beautiful UI with copy/retry functionality
- Advanced chatbot features
The React app connects to the backend API to provide a complete user experience for interacting with your AI agent.
API Reference
FluidToolsClient
Main class for managing AI agents.
new FluidToolsClient(
providerConfig: ProviderConfig,
toolsGenerator: Function,
systemInstructions?: string,
maxToolCalls?: number,
debug?: boolean,
expireAfterSeconds?: number,
confirmationConfig?: ToolConfirmationConfig,
toolsConfig?: Record<string, any>,
embeddingConfig?: EmbeddingConfig
)
Key Methods
- query(query: string, accessToken?: string): Execute a natural language query
- clearThread(accessToken?: string): Clear conversation memory
- getPendingConfirmations(accessToken?: string): Check pending tool approvals
- approveToolCall(toolCallId: string, accessToken?: string): Approve a pending tool call
- rejectToolCall(toolCallId: string, accessToken?: string): Reject a pending tool call
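Putting the confirmation methods together, a hedged sketch of an approval round-trip, reusing fluidClient and token from the Quick Start example (only the method names above come from the API; the shape of the pending-confirmation objects is assumed):

// Ask the agent to do something that may require approval
const answer = await fluidClient.query("Delete the staging instance", token);

// See whether any sensitive tool calls are waiting for a human decision
const pending = await fluidClient.getPendingConfirmations(token);

for (const confirmation of pending as any[]) {
  // toolCallId / toolName are assumed field names on the pending-confirmation objects
  if (confirmation.toolName === "delete_instance") {
    await fluidClient.approveToolCall(confirmation.toolCallId, token);
  } else {
    await fluidClient.rejectToolCall(confirmation.toolCallId, token);
  }
}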
Provider Configuration
// Environment Variables
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
OLLAMA_BASE_URL=http://localhost:11434
// Or programmatic
const config = {
provider: "openai",
model: "gpt-4",
apiKey: process.env.OPENAI_API_KEY,
temperature: 0.1
};
CLI Usage
Generate tools from Postman collection:
fluidtools <input-file> [output-file] [--help]
# Examples
fluidtools api.json tools.ts
fluidtools ./collections/my-api.json
Contributing
- Fork the repository
- Create a feature branch: git checkout -b feature/amazing-feature
- Commit changes: git commit -m 'Add amazing feature'
- Push to the branch: git push origin feature/amazing-feature
- Open a Pull Request
License
ISC
Contributors
We'd like to thank all the amazing people who have contributed to FluidTools!
- KartikJoshiUK - Creator & Lead Developer
- Jatin Godnani - Core Contributor
Sponsors
Support
- Documentation: GitHub Wiki
- Issues: GitHub Issues
- Discussions: GitHub Discussions
Built with ❤️ for developers who want AI-powered API interactions