
Structured responses in GPT webhook functions

5 minute read · Advanced

Reliably get JSON responses from the parse_via_chat_gpt and parse_via_gpt_vision webhook functions.

Introduction

If the webhook call type is POST/GET, we expect the response from the external API to be JSON. We do not do this for the FUNCTION webhook type, because responses from functions (especially LLMs) are not reliable enough to return valid JSON every time. For that reason we store the function's response as a string: even when the function does return proper JSON, we store it as a string and make no attempt to parse it.

Recently OpenAI added a feature called Structured Outputs, which lets us specify the expected response schema in the request itself, so responses arrive in a reliable JSON format. We leverage this and parse the response: if it is valid JSON, its key-value pairs are added to the results variable.

Using structured responses in webhooks

Add an optional parameter called response_format to the existing parse_via_gpt_vision/parse_via_chat_gpt webhook body.

The value of response_format is the same as what OpenAI's API expects; see the OpenAI API reference linked in Resources below.

In short, if the value is {"type": "json_object"} and the prompt contains the keyword "json", the response will always be valid JSON. If the value is {"type": "json_schema"}, we also have to pass a schema, as described in the OpenAI docs.

In both cases OpenAI returns valid JSON, and the key-value pairs are merged into the results variable.
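The merging behavior described above can be sketched as follows. This is an illustrative sketch, not the actual webhook implementation; the function name merge_function_response and the results dict are assumptions.

```python
import json


def merge_function_response(results: dict, response_text: str) -> dict:
    """Merge a function/LLM response into results.

    If the response parses as a JSON object, its key-value pairs are
    added to results; otherwise the raw string is stored, matching the
    old string-only behavior. (Illustrative sketch only.)
    """
    try:
        parsed = json.loads(response_text)
    except (json.JSONDecodeError, TypeError):
        parsed = None

    if isinstance(parsed, dict):
        # Structured response: merge each key-value pair into results.
        results.update(parsed)
    else:
        # Unparseable response: fall back to storing the raw string.
        # The key name "response" is an assumption for this sketch.
        results["response"] = response_text
    return results
```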

Examples

json_object

[Image: example webhook body using the json_object response_format]
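In case the screenshot does not render, here is an illustrative webhook body with response_format set to json_object. Only response_format is the documented addition; the prompt field and its wording are assumptions for the sketch. Note that json_object requires the word "json" to appear in the prompt.

```python
import json

# Illustrative parse_via_chat_gpt webhook body (field names other than
# "response_format" are assumed for this sketch).
webhook_body = {
    "prompt": (
        "Extract the volunteer name and volunteer ID from the text "
        "and return them as json."  # json_object needs "json" in the prompt
    ),
    "response_format": {"type": "json_object"},
}

print(json.dumps(webhook_body, indent=2))
```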

The corresponding webhook response will be:

[Images: webhook response with the parsed JSON merged into results]

WARNING: If we use json_object as the response_format, the response will always be JSON, but there is no guarantee that the keys will be the same on every run. For example, in the next run the key "Volunteer ID" could come back as "VolunteerID", unless the key names are stated explicitly in the prompt.

json_schema

For the same prompt as above, the response_format would look something like:

[Images: json_schema response_format and the corresponding webhook response]
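As a textual stand-in for the screenshots, here is an illustrative json_schema response_format following OpenAI's documented structure. The schema name and property names are examples for the volunteer-extraction prompt, not the actual screenshot contents.

```python
import json

# Illustrative json_schema response_format. With "strict": True and
# "additionalProperties": False, the model must return exactly these keys.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "volunteer_details",  # example name, an assumption
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {
                "Volunteer Name": {"type": "string"},
                "Volunteer ID": {"type": "string"},
            },
            "required": ["Volunteer Name", "Volunteer ID"],
            "additionalProperties": False,
        },
    },
}

print(json.dumps(response_format, indent=2))
```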

The advantage of json_schema over json_object is that the JSON keys are deterministic, always matching the specified schema.

Resources

https://openai.com/index/introducing-structured-outputs-in-the-api/
https://platform.openai.com/docs/api-reference/chat/create#chat-create-response_format