Engineering · TypeScript · Best Practices

Taming the Beast: Forcing JSON out of LLMs


The Deterministic Nightmare

Software engineering relies on contracts. If an API promises a userId as a string, it better provide a string. LLMs, however, are probabilistic. They might give you JSON one time, and a polite apology the next.

The Old Way: Prompt Begging

Developers used to write prompts like: "Please return only JSON. Do not say 'Here is the code'. Just the JSON."

This is fragile. The model can still prepend commentary, wrap the output in markdown fences, or refuse outright. If it breaks even 5% of the time, that's an error rate no enterprise pipeline can tolerate.
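To see why prompt begging leads to so much defensive code, here is a minimal sketch of the parsing gymnastics it forces on the caller. The extractJson helper is hypothetical, shown only to illustrate the pattern:

function extractJson(raw: string): unknown | null {
  // Strip the markdown fences the model often adds despite instructions.
  const cleaned = raw
    .trim()
    .replace(/^```(?:json)?\s*/i, '')
    .replace(/```$/, '')
    .trim();
  try {
    return JSON.parse(cleaned);
  } catch {
    // The model answered with prose, an apology, or malformed JSON.
    return null;
  }
}

// A fenced response parses; a chatty one silently yields null.
const ok = extractJson('```json\n{"name": "Ada"}\n```');
const bad = extractJson('Sure! Here is the JSON you asked for: ...');

Every caller now has to handle the null case, and nothing stops the parsed object from having the wrong shape entirely.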

The New Way: Tool Definitions & Zod

Instead of begging, we enforce the schema programmatically: define it with Zod and let a library such as the Vercel AI SDK's generateObject use function calling (or OpenAI's JSON mode) under the hood.

import { generateObject } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const UserSchema = z.object({
  name: z.string(),
  age: z.number(),
  interests: z.array(z.string()),
});

// We force the model to adhere to this schema
const result = await generateObject({
  model: openai('gpt-4-turbo'),
  schema: UserSchema,
  prompt: 'Generate a persona for a gamer.',
});

This approach guarantees that result.object matches the TypeScript type inferred from your schema. If the model hallucinates or omits a field, the validation layer catches it, and you can even feed the error back and ask the model to fix it (a pattern often called Reflection).
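The Reflection loop can be sketched roughly as follows. This is an illustrative sketch, not the AI SDK's built-in behavior: callModel is a hypothetical stand-in for your LLM call, and a hand-rolled validator replaces Zod's safeParse only to keep the snippet self-contained:

type Persona = { name: string; age: number };
type Checked = { success: true; data: Persona } | { success: false; error: string };

// Stand-in for UserSchema.safeParse(value).
function validatePersona(value: unknown): Checked {
  const v = value as { name?: unknown; age?: unknown };
  if (typeof v?.name !== 'string') return { success: false, error: 'name must be a string' };
  if (typeof v?.age !== 'number') return { success: false, error: 'age must be a number' };
  return { success: true, data: { name: v.name, age: v.age } };
}

async function generateWithReflection(
  callModel: (prompt: string) => Promise<unknown>,
  prompt: string,
  maxAttempts = 3,
): Promise<Persona> {
  let currentPrompt = prompt;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const raw = await callModel(currentPrompt);
    const checked = validatePersona(raw);
    if (checked.success) return checked.data;
    // Feed the validation error back so the model can correct itself.
    currentPrompt =
      `${prompt}\nYour previous output was invalid: ${checked.error}. Return corrected JSON only.`;
  }
  throw new Error('Model failed to produce valid JSON');
}

On each failed attempt, the schema error becomes part of the next prompt, which is usually enough for the model to repair its own output within a retry or two.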
