Get your App talking to ChatGPT in JSON in under 3 minutes

⚡️Hudson Ⓜ️endes
5 min read · Aug 4, 2023

LLMs read and write plain text. But you can get them to output JSON. Learn how in under 3 minutes, with runnable code examples in Python and Java!

DreamStudio. Prompt: artificial brain interacting with software system, minimalistic, zen, meditational, volumetric light, masterpiece, abstract art

This article was inspired by the free DeepLearning.AI course on Prompt Engineering by Andrew Ng and his team.

1. Beyond ChatGPT Web UI

Although the ChatGPT Web UI is widely known, it isn’t the only way to use GPT-3.5 or GPT-4. OpenAI provides an API that allows programmatic interaction with these large language models (LLMs). In essence, LLMs are all about completing text: you “prompt” the LLM with context, and it generates the next tokens for you. This technique, known as prompt engineering, is the fundamental way to interact with all LLMs.
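As a minimal sketch, this is roughly what a single-turn request body for OpenAI’s Chat Completions endpoint (`https://api.openai.com/v1/chat/completions`) looks like. The model name and prompt are illustrative; actually sending it requires an HTTP POST with your API key in the `Authorization` header:

```python
import json

def build_chat_request(prompt: str, model: str = "gpt-3.5-turbo") -> dict:
    """Build the JSON body for a single-turn Chat Completion request."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0,  # deterministic-leaning output, useful for structured replies
    }

body = build_chat_request("Summarize prompt engineering in one sentence.")
print(json.dumps(body, indent=2))
# POST this body to https://api.openai.com/v1/chat/completions
# with header: Authorization: Bearer $OPENAI_API_KEY
```

The official `openai` client library wraps this same request; the raw payload is shown here so the shape of the exchange is explicit.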

2. The Problem

Mixing code and natural language may seem like trying to blend oil and water. The structure of information in natural language can be ambiguous or abstract, while code requires information to be structured meticulously. Most interactions with external software, such as APIs, involve exchanging data in formats like JSON or XML.

Unfortunately, Chat Completion APIs return plain text. If LLMs could return JSON, it would open a new world of possibilities… But wait, they can! And it’s simple!
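One simple approach (a sketch, not the only method): instruct the model in the prompt to respond only with JSON, then parse the reply defensively, since models sometimes wrap the JSON in extra prose. The schema below is a hypothetical example:

```python
import json

# Hypothetical instruction appended to the prompt to steer output toward JSON.
JSON_INSTRUCTION = (
    "Respond ONLY with valid JSON matching this schema: "
    '{"sentiment": "positive" | "negative", "summary": "<one sentence>"}'
)

def extract_json(reply: str) -> dict:
    """Parse the model's reply as JSON, tolerating surrounding prose."""
    try:
        return json.loads(reply)
    except json.JSONDecodeError:
        # Fall back: grab the span from the first '{' to the last '}'.
        start, end = reply.find("{"), reply.rfind("}")
        if start == -1 or end == -1:
            raise ValueError("No JSON object found in reply")
        return json.loads(reply[start : end + 1])

# Example with a reply that wraps JSON in conversational filler:
reply = 'Sure! Here you go: {"sentiment": "positive", "summary": "Great product."}'
print(extract_json(reply)["sentiment"])  # → positive
```

The fallback matters in practice: even when told to return only JSON, chat models occasionally add a leading “Sure!” or trailing explanation.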
