ChatGPT using OpenAI APIs
Using ChatGPT within Glific via the OpenAI APIs
Leverage the capabilities of GPT models developed by OpenAI by using OpenAI APIs.
| 3 minutes read | Level: Advanced | Last Updated: December 2025 |
How it works
- The user asks a question after the flow is initiated
- An OpenAI API call is made using a webhook within the flow
- The question, the prompt to the model, and the model to be used for answering the question or performing the task are provided in the webhook params
- The response returned from the GPT model can be sent to the user directly or used as desired in the flow
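The steps above can be sketched in Python. The actual call is made by Glific's backend, so the function name and structure here are illustrative assumptions; only the endpoint and payload shape follow the OpenAI chat-completions format.

```python
import json

# OpenAI chat-completions endpoint (the call itself is made by
# Glific's backend, not by the flow directly).
OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def build_openai_request(question_text: str, prompt: str, gpt_model: str) -> dict:
    """Build the payload the webhook conceptually sends to OpenAI.

    The prompt becomes the system message and the user's question
    becomes the user message, per the chat-completions format.
    """
    return {
        "model": gpt_model,
        "messages": [
            {"role": "system", "content": prompt},
            {"role": "user", "content": question_text},
        ],
    }

payload = build_openai_request(
    question_text="What is Glific?",
    prompt="Answer in less than 5 sentences.",
    gpt_model="gpt-4o",
)
# An HTTP POST of this payload, with an "Authorization: Bearer <key>"
# header, to OPENAI_URL returns the model's completion.
print(json.dumps(payload))
```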
Using the webhook for OpenAI API call in a Glific flow
(Representative image explaining the steps of an OpenAI API call in a simple flow)
- Get the user question
- Add a `Call a webhook` node.
- By default, `FUNCTION` would be selected. Leave this as it is.
- In the `FUNCTION` field, select the pre-defined function `parse_via_chat_gpt` from the dropdown.
- Give the webhook result a name; you can use any name. In the screenshot example, it's named `gpt_response`.
- Add the parameters in the `Function Body`.
- Click on `Function Body` in the top right corner. You would see the following.
- Pass the following parameters:
{ "question_text": "@results.question_1", "gpt_model": "gpt-4o", "prompt": "Answer in less than 5 sentences." }
Here `question_text` is the parameter carrying the user's question, and `prompt` is the instruction sent to the model along with that question.
`gpt_model` selects the model to use for the given task. The model name must correspond to one of the text models listed in the OpenAI API documentation. See the OpenAI models documentation.
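Before the webhook fires, flow variables such as `@results.question_1` are expanded to the values stored earlier in the flow. A minimal sketch of that substitution, assuming a simple name-to-value mapping (the real expansion is performed by Glific's flow engine and supports richer paths):

```python
import re

def expand_results(params: dict, results: dict) -> dict:
    """Replace @results.<name> references in string params with
    stored flow results. A simplified stand-in for Glific's own
    flow-variable expansion."""
    def sub(value: str) -> str:
        return re.sub(
            r"@results\.(\w+)",
            lambda m: str(results.get(m.group(1), m.group(0))),
            value,
        )
    return {k: sub(v) if isinstance(v, str) else v for k, v in params.items()}

params = {
    "question_text": "@results.question_1",
    "gpt_model": "gpt-4o",
    "prompt": "Answer in less than 5 sentences.",
}
# "question_1" here stands for the result name of the node that
# collected the user's question earlier in the flow.
expanded = expand_results(params, {"question_1": "What is Glific?"})
# expanded["question_text"] is now "What is Glific?"
```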
- The response from GPT is available as `@results.webhookresultname.parsed_msg`; in the given example, `gpt_response` is the webhook result name (see the first image).
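The webhook result is stored under the name chosen in the node, and the model's reply sits in its `parsed_msg` field. A small sketch of that lookup; the exact shape of the stored result is an assumption based on the variable path above:

```python
# Hypothetical stored flow results, keyed by the result name chosen
# in the webhook node ("gpt_response" in the example above).
flow_results = {
    "gpt_response": {
        "parsed_msg": "Sample model reply.",
    }
}

def resolve(path: str, results: dict):
    """Resolve a dotted path such as 'gpt_response.parsed_msg'
    against the stored results, mirroring how the flow reads
    @results.gpt_response.parsed_msg."""
    node = results
    for part in path.split("."):
        node = node[part]
    return node

reply = resolve("gpt_response.parsed_msg", flow_results)
# 'reply' can now be sent back to the user, e.g. in a send-message node.
```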
Reach out to the Glific team to flag any further customizations needed within this functionality.