
Every developer faces the same challenge: generating realistic test data. Libraries like Faker.js exist but they're static and limited. Ask ChatGPT to "generate 50 e-commerce orders" and you'll get nice JSON, but with zero consistency—a customer named "John Doe" might have an email like "alice@example.com".
I needed a tool that combines AI's flexibility with type safety, offers polyglot output (JSON → SQL → CSV), and follows a privacy-first approach. That's why I built Mockator.
Mockator = Mock + Generator
Three main goals: type-safe generation powered by AI, polyglot output (JSON → SQL → CSV), and a privacy-first design where API keys never touch the server.
The API route (/api/generate) acts as a simple proxy:
// app/api/generate/route.ts
import { streamText } from "ai";

export async function POST(request: Request) {
  // User sends their key in the header
  const provider = request.headers.get("x-provider"); // openai, anthropic, etc.
  const apiKey = request.headers.get("x-api-key");
  // Map provider + key to an AI SDK client (helper sketched below)
  const client = getClient(provider!, apiKey!);
  // Stream with AI SDK; modelId, systemPrompt and userPrompt come from the request body
  const result = await streamText({
    model: client(modelId),
    system: systemPrompt, // "You are a Mock Data Generator..."
    prompt: userPrompt,
    temperature: 0.5,
  });
  return result.toTextStreamResponse();
}
Key point: No keys stored on the server. Direct pass-through to provider.
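The client factory itself isn't shown above. Here is a minimal sketch of how the header values could be mapped to the AI SDK's provider packages; the getClient name and file path are my assumptions, while createOpenAI, createAnthropic, and createGroq are the SDK's own factories:

// lib/get-client.ts (hypothetical helper)
import { createOpenAI } from "@ai-sdk/openai";
import { createAnthropic } from "@ai-sdk/anthropic";
import { createGroq } from "@ai-sdk/groq";

export function getClient(provider: string, apiKey: string) {
  // The key from the request header is used for this call only and never persisted
  switch (provider) {
    case "openai":
      return createOpenAI({ apiKey });
    case "anthropic":
      return createAnthropic({ apiKey });
    case "groq":
      return createGroq({ apiKey });
    default:
      throw new Error(`Unsupported provider: ${provider}`);
  }
}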
The AI always returns JSON. When users click the SQL or CSV tabs, we transform it instantly in the browser:
// lib/transformers.ts
export function jsonToSql(json: any[], tableName = "mock_data"): string {
  // INSERT INTO mock_data (id, name) VALUES (1, 'Alice');
}
export function jsonToCsv(json: any[]): string {
  // id,name\n1,Alice
}
Advantage: No extra network requests, instant transformation.
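For readers curious about the shape of those transformers, here is a stripped-down sketch, not Mockator's exact code: column order comes from the first row and quoting is deliberately naive.

// Simplified sketch of the transformers
export function jsonToSqlSketch(rows: Record<string, unknown>[], tableName = "mock_data"): string {
  const columns = Object.keys(rows[0] ?? {});
  const quote = (v: unknown) =>
    typeof v === "number" ? String(v) : `'${String(v).replace(/'/g, "''")}'`;
  return rows
    .map(
      (row) =>
        `INSERT INTO ${tableName} (${columns.join(", ")}) VALUES (${columns.map((c) => quote(row[c])).join(", ")});`
    )
    .join("\n");
}

export function jsonToCsvSketch(rows: Record<string, unknown>[]): string {
  const columns = Object.keys(rows[0] ?? {});
  const escape = (v: unknown) => `"${String(v).replace(/"/g, '""')}"`;
  return [columns.join(","), ...rows.map((row) => columns.map((c) => escape(row[c])).join(","))].join("\n");
}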
API keys are sensitive. We shouldn't persist them anywhere, not even in the browser. The new approach: keep them in useState, in memory only.

// MockatorProvider
const [providerKeys, setProviderKeys] = useState<ProviderKeys>({});
// No useEffect, no localStorage!
// Keys only in memory, gone when tab closes
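In the app, that state sits behind a React context so any component can read the keys. A minimal sketch; everything here beyond the MockatorProvider name and the useState line above is an assumption:

import { createContext, useContext, useState, type ReactNode } from "react";

type ProviderKeys = Record<string, string>; // e.g. { openai: "...", anthropic: "..." }

const MockatorContext = createContext<{
  providerKeys: ProviderKeys;
  setProviderKeys: (keys: ProviderKeys) => void;
} | null>(null);

export function MockatorProvider({ children }: { children: ReactNode }) {
  // Keys live only in React state; a refresh or closed tab wipes them
  const [providerKeys, setProviderKeys] = useState<ProviderKeys>({});
  return (
    <MockatorContext.Provider value={{ providerKeys, setProviderKeys }}>
      {children}
    </MockatorContext.Provider>
  );
}

export const useMockator = () => useContext(MockatorContext)!;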
Side-by-side split panels on desktop, vertical stack on mobile:
// Workbench.tsx
<div className="flex-1 overflow-hidden flex flex-col md:flex-row">
  {/* Desktop: Horizontal ResizablePanel */}
  <div className="hidden md:flex flex-1">
    <ResizablePanel>Input</ResizablePanel>
    <ResizableHandle />
    <ResizablePanel>Output</ResizablePanel>
  </div>
  {/* Mobile: Vertical Stack */}
  <div className="flex md:hidden flex-col">
    <InputPanel />
    <OutputPanel />
  </div>
</div>
Tailwind's responsive classes (md:, sm:) make the layout automatically adapt.
Paste a TypeScript interface:
interface User {
  id: number;
  name: string;
  email: string;
  role: "admin" | "user";
}
AI understands it and generates type-safe data.
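For the User interface above, the generated rows might look like this (illustrative values only). Note that each email actually matches its name, which is exactly the consistency a generic "generate me JSON" prompt tends to miss:

[
  { "id": 1, "name": "Alice Johnson", "email": "alice.johnson@example.com", "role": "admin" },
  { "id": 2, "name": "Marcus Lee", "email": "marcus.lee@example.com", "role": "user" }
]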
"Generate 10 e-commerce orders with customer name, total price, status, and date"
Boom. 10 rows of realistic data.
Thanks to Vercel AI SDK, data arrives chunk by chunk. Monaco Editor displays it live—like watching console.log in a terminal.
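On the client, nothing beyond the standard fetch streaming API is needed to consume it. A sketch, assuming the header names from the proxy route above and a hypothetical onChunk callback that appends text to the editor:

async function streamMockData(
  prompt: string,
  provider: string,
  apiKey: string,
  onChunk: (text: string) => void
) {
  const response = await fetch("/api/generate", {
    method: "POST",
    headers: { "x-provider": provider, "x-api-key": apiKey, "Content-Type": "application/json" },
    body: JSON.stringify({ userPrompt: prompt }),
  });
  const reader = response.body!.getReader();
  const decoder = new TextDecoder();
  // Append each chunk as it arrives, so the editor fills in live
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    onChunk(decoder.decode(value, { stream: true }));
  }
}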
If one provider is expensive, switch to another:
Select from dropdown, enter key, ready to go.
With streamText, you can use OpenAI, Anthropic, and Groq with the same API. Provider switching becomes trivial.
Initially, I thought "let's generate SQL on the backend." Then I realized the AI already returns JSON, and converting it in the browser is instant, with no extra network round trip.
Syntax highlighting, line numbers, copy/paste—all out-of-the-box. 5-minute setup with @monaco-editor/react.
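A read-only output pane with @monaco-editor/react is roughly this; the wrapping component and its props are my assumptions, while Editor and its props are the package's documented API:

import Editor from "@monaco-editor/react";

export function OutputEditor({ value, language }: { value: string; language: "json" | "sql" | "plaintext" }) {
  return (
    <Editor
      height="100%"
      language={language} // switches highlighting when the user changes output tabs
      value={value}
      theme="vs-dark"
      options={{ readOnly: true, minimap: { enabled: false } }}
    />
  );
}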
git push
# Vercel auto-deploys
Edge runtime + ISR means fast globally. No cold starts.
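Opting an App Router route into the Edge runtime is a one-line export in Next.js (whether Mockator's route does exactly this is an assumption):

// app/api/generate/route.ts
export const runtime = "edge"; // runs on Vercel's Edge network, so no cold starts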
Project on GitHub: github.com/berkinduz/mockator
Questions or feedback: berkinduz.com/en/about
Mockator is a great example of building practical AI tools. Not fancy, but it solves a real problem in the daily workflow: realistic, consistent test data in JSON, SQL, or CSV, without handing your API keys to anyone's server.
Next time you need test data, try Mockator instead of Faker.js. 🚀